[Unrecoverable binary content: this section is a raw dump of a ustar tar archive of CI output whose payload is gzip-compressed data and does not decode to readable text. Only the archive member listing is recoverable:]

var/home/core/zuul-output/                      directory, mode 0755, owner core, group core
var/home/core/zuul-output/logs/                 directory, mode 0755, owner core, group core
var/home/core/zuul-output/logs/kubelet.log.gz   gzip-compressed kubelet log, mode 0644, owner core, group core
,-P)4ƘxI= ,(0sA J0尉'{0ᘬM o{Od}뱎xb17tUGG9l$۪<ض F+3C&3CW`&=Pa9 "(3䳮O}7=q Q)aG&=tWO`bU&XɣQW\-E] CWWJu+JU&XQW\y4 T6TW-E aS$2meX' h r  K'-=)_~I`p2>&dݞp)8krpԢmZ_YfvK_7;Smio?.9{IT`z K(kX!TLl,Y;rM:/Rdda9.k<3V;w1|hW;楊sFXuUVl|vqiJFLy0TJ Ocaœb:ǫ5ɑs:z*`vC*@ۘ0{2a f-mw[ݚj>b5qr2<2M8_Ccq-i*jF#dt!)`A2"x%c" gGP.BeN}q+C_zL7>rg㤘Տ܄nzsX/Ҝ4xyLsgI 8& wFIfa'v]Wk_F;0Bg0Vkpi Zڞ{)Rޕ#/އ|fN $;c<&8A[i%ؒb"Y%ULlĺΫjSF4$h0JZ2(-³@A'ZKPrcPBHKInD:ikH[R]#zj1!15gZiSQ).4k̦Ϥ9ܤSJ}vKǏl2>:8g]%~Ju;3emspsDΪ+ҽQixlExe޸wCi GcBR" 6*b""MQJOuд[(pߜ2'J{(üe W.QQ'҂ۛ{ӣjаmi%n{%h[ D[c]󡸽vH뱐VP`LX$Ԛۖ]ض1g̮Dl0ѢsoRUֳTHAH[wꬍHGK""[zmd>5H|RFm:x!i@ -eN"1D(QD@NJx4$hbgc'`6Th^ B^ׇQ dr|_ٴZ;eH)(%'/6 Lj-P\&us1\L$Qܹ8-z1XTX:VUtA[GsjVn[u1@ 1P(z༷T,N%u;˝^iT.&m٥'\2j #_#]o`/a5Y+ߡM'S(3[Um操2*{ѴjӯS:+ۙȍMTYtʑ@hsӹwfrdV.r,*ve]{s8ɖFu_+8Usg^YkyM~#? K7H1h!epZtKó{ː]_,\̇ ֜^u P-<;ju#Bq&1}o4"2)]ƥy z҉`UɌpėyV sy}:,[G־QI76r+-\(M)ņ`9Gs+*HNJ{hM-Oe8/'ș 8>°}7&zRvE>}?XȡόӇVn %'NR(/Pgآ;c G@[hAA^krJR2TCR.Jy-{gDvw `X<YڇzA>#(2D@{šѢ#\HjQB C: *RA#RTYCU XqML $p CCJ%5rK8)a|Yvm3,ɲ\-J=M:Rp\L.TKS ʠѥpHgBv&dgB ɜ $T?L[D*9(51 D)X|d5!mB1HDq]D5ϸ()Grfde9; ӋFѕ͆cXFϗÃfӘ_fib:Hx2"lxu.hY(asal cdZ}DjhI0cm0-˫7KX9mO0Wu$49YT@ 'G&MF2eísVD)s*D*&. ZS@# '76 ^%LMϔ-e9;,x)*84yPϋwE ~߹h{ӕcYcYk]hr|"蔔 "$"@@B1BIkI,% KCH:~H~C`&WhWk,G-NAhtWFe2Vi4% (.[%p/Ty*#~@yfB_[XwaMA33GIƬ@& Q@PU҄}BƣV?OZv$c)9c"/%+.K A.1ӨMrv> Z{kw;2kng}(~~~qɿ;J c4;Ak.)3ë*n$ji]x+ea\=I[o\%SBy?"O;D7ɼ5A5h )(9:ӊcM&pG)*#rV65Y)XO] b)w=k^86g`0SNj"vꞽb*E2%O*Mt|}URKhpuv]ۿkhFaCV%8GbmLE]=o3j䇪|Upv1%1Cozn]FiM,&ӝ=)ؓwt knV5 bQ8yJ>=YٻM7 sglu>Ȧ^*Ћ^'D2;> !7}9A#bXyc(tC ymXoP7:GvWO}濯_N_|2s_/O_W L!pe bD-H,^X(jd?]~x1&a__ϓU4` n8 ٛaD0r\~Ow-x[]ctmlwW6wz)"V* h6/h# we8US*Jj4+><6#)8 $eЖq~&篆ѿuu0"sBA{eW:$Mk-.IALhMg=\;$yk !s -(4s]4g M9bH7^9d@D 9-Cv+r[ݑ;eG*w6m,s Zz(Ayϙ.Rxi%r<&F&Ҝ)Cp"s֑|BFFu`1*%gTi/2/ ))*F#DDC":Th G| /Rij7"qb42b0R5j,‰zK:Qg Ba[#g:>Nܭ7Ϛ3(G~H>vOku؞Ƭ%jA 7g~^W mv癕w'%ԆX<^xO+,tZ!?jPx;7̯. dCrx<J_|6}7,踊zś~xx9*~^_ѯ'__jPN ꯏGI]0i92__~?H@AiBRhJc( aJPxSV6sW+S$IKL,Ot䞅@HL稞xkP;F뗘 T0+!Gf7Tˊ~ 8?QHRȌ?"OP=ϛ f% zqeJ9-YL'9Nn\t|6 )Z#1&|OnɊ &pVaD=)^OQvojE g-dYAQ_+"lҍ-}wS^GI6dϼu}e,Vɪx]'Br.#W|J,ְԆ+R Z3w o2\Opg 9&:wL4A뙁ȻR{{Xm0z/79lLO U$Q`~af¡|KcX'&lb&U  YU;nƬld"w3Rzg6/+^>%yDHI %#c8+*f:8?>o|Wj[]"Tl;rݽJcrd7n}lZ z+Co6kl2k oBy fiKMFe91XwLW>ɉ^6eX=X1( fٜ7H\I j#c5qGz\VbY@LO 3^9&ߟKfr>duh`0:'ӯa"5){ c2 $J&GpN%%NDv)Y;aŠ,0()Z] *2v̺ ҡt:G,[s?b0]21w]Amݡv`2 PYKŸ^g >jHfc,De*.B&f%YQ!IȢK"J,QxXM5^wkMTFD!bwx+Ms{)[B#h!12֌{%%A\ ng9cHQI]ֆd$3!h%G#HA B#8#b{0|:MqVEbwʆS [QZ1jel$Ty@B`4awMUC>s(jvЫ#] |qF:;y?>v#j* Ym٩`o%sqD%W|H>[v]qPվL ׍8`Y(b }!2P%rlPz:q:6;q#Rrc%ubڢ1`MP2Ȓbh5qpt" CB-\ J@K?+bМr?K?im(dq{"j?Hh9ΗLojcP) Y먒O, 6;KC^дA|,jUZq:NZ*n#ȍep%":kX+aLrKजwԙ! In]FGة&FۆY}[QZ ȌVgjX*e,GϨ;#p΢UY whSmVDH16%uΜQ6D6U>^Y\P:52BPe`5S d-:p2g"Z dq[ꯨK2SM*k|2F?XC4#;.]⇲pHM*MugEЕqIZF Fi"W(enf- u&q6ry^y~l]qDBt@ RBeAkhL٥L8JJ1m#mCDg9LR9d,H8 )f6eV'7 j%ɡ.ERyGt1%4͌K7ߛo!SxΎs5Oư+$ɱcGOÎcd[>;)/o1PaVR6i"GECSo4g`::rԑ%G(r 46(baF923x  Y24%.|(rk EG,"/"e?6#/9je00N=JƍAEf%>5(H .f23s÷l]K$[j'CCNdBYh! 
xn%nK0oGWEn IzRjO00LZ{kS bR}4t?i?yYx>|>йW l?2vf0`ࡻPWV 2y!Vz H]j= vnv^peKӣ鰝Vce?GWk0y+&V]wajvwi3no"0d, 8 מJ# $m#3ZD+N9sse0PT LjXPI2xKW:!(AdɱT;WF5qgW.l@rA&/XteUBL4R2&c(YZJpB2lY4̮DIKc.4qp:!u,]B]m4BI4L{r2*uMk87IP U4ĽUz'R}lTa8٪pYGn)Xg E`o-W\Vj%Nx/i>5Wˊ!N{vO3<_ >~.e/k0޿g^\rYiמZV7Vz׌ɲ5)r[2θ`w'<ɔ@x[o5C/id1 A2,.-M x0pPF;ן'Lswhg<HD-jr;.$G&{qo qVm\qTRQ.T;rԻRӽeuko>/LnƀAhֶ]"jßN q8|%[Ug֒UbXY1pb`i2ECO.wUΩE%W}R'eG:R V4|~[ W|F-8F/䊗j^nb䈆~7߿{[~|};x{o>{Mߨib[Q/`z)Bkiښ 韯$3z| uz?L Հxҟ`efEFVTޢha:E7Yp\e./B,q͈u'mPם6U6qt$z+uVT3ʹ'Y ; 2 `7`3ğWhQRdN%fP88$m.]L/%\M6 %I>e3c$%e1avFc,p-VHv%=h2Cd>eT6lBTjc+ɷVԳ }[=wܨ闭96&2j;Y/GGHX=מaR"bN,.|<ރE`/k'Ah)Ś"3fs&o)6:l_v)4:mqy%xBGɖ uz_C*_[~@OR{$aT^,N }:g[==[2yVL|NLEN9s9sΜ]͙P b4LJkE2ݹ!d3|aScsXg3rqoN*aΆ6(ը┚1423ȝWt~qrzv݉O/{y 5^R\w>,X^\)%kèvrDMؗ*WA!V(xdU٤B-JՀ8~?΂K2 B BEb$NQ%&r 4V;I[lky3㳶ؓ9_ NT4RTޠ+feHʒpeoSp8ngSv6)ߎ)*7;ؗ"UXӗ]s`6*dqFBl 6 gՇE(nQ %3`"5~1 l,^^mQVF18flUUc|c߳P;K%$v-;6hYH=Rl 7!! q -l!nzxW‡Wv\&'c39bkt-\*Ixn62@} ~yK^R!;*}k&)jbJ uP>VLQrv8Y'}9uBk}/X"G8߬I>V Ccr $f pfw9 &X|u \eJIa,Pu ƭhz9z;^ "w5>l|naF>bB+sA\)jɕ@Lf,yy\6Bjvuدq|'֒_䃣OMei9Yr0;< }#8?`򜗹bu6"yR\bb3~ M98vbO0RKs/̩I Y1^6$JX*{h6-&ސc},#wm?{y pY _^7=";o#EvngvHȂŵNoOWQayHN:)$:=.9"ƷdGZҙQ+FljRr%bOȊ"8ۆ# 0ZI351:b%R^FΣdS3JB2x'mA*3] vrxzV):z{=Ojo6-i%8L6~9r!*jb r$ـ5`sYIhJ3Ny(Ocl p6!ƻfoDЬf)6\me%w/%p>`koI JFlI7ڻ TҍޢtFtctr\\R^?,$SbmM+z;0"bsl_k-sS~ޗ}؊)݃+q-fNEVoE!a(dK2g{bIiÃ|`^ehwO\eޯmTѾ.r{}:-qwJKc ZPPfDxDcybz [CLU*4׼ XR&D:2Q/X*/>P&4Y?5p=l_}Tt^8^<2r)De_}wtvp>^i~87z%GeOodžQp˟S[}>s޷/?~sz|~ߤ_w}}Sob茵xDt:p|pV~\rM9^vAMFI ȖəR65/]Vn{-e,qKfaurʄٖbL߿ǃs+) 5ڽc[t<ɒ`\ey^_Ym{zJc<+d7h875Yɿugs巻X\& )-5P!=׿7{pt5Ϻ? ~uk1Iw^} \a<^'Zލ[-/??R$ۀ򊧷{߿| .߫^6UZ:?QpKte?:79 Gwϼ}Z97|Eh)Z yPTݡolwѳx87u!t~tpx>@@{Cւu0]gfG-],cкwI'"Նn}x_:o93;.VÄSt~ES|se9wbǼsx7O<)g:ou"4 Gi H&ƇPZS*1<$[N3d}c~Em p gзXrCE 3I|6 UgOacz}jکtƝr^q~y!Ǫ'\{#@|_;xd_i~zR]å^ś/? sjir”Qx2MMa:ٵj Rh}h`[/yVrSs,H⌨.QDgc*M sC /v@o g^';\K9sUJ IQ'b>^Ip0֝[;`Qh0ۣs:iA25s(K)dT˾ I0Dq}3;CQ埊bJFB=2;PĨp|TwnQ2_理w;Uf8y*&(Yo]}|7<[dmn-ϴTЋa=O&z$*oߨ@R`)AoQlq:4dh8Ԫl'5/M)B| jVYܚK$Umw)HFaَ0,62a0 ܓ3 ˜ue]Tj0g=^!\SK>%[',bct9&gG.UZ "؜ކ.Rsh;lb\]3uQ\ H3x0bvG6:vÌ3q;622H=R$l*!F5jBd)UtCJfN2Cp )T1%X]V1J?H6_MQQ-Fכ֝x^wG"" FDqFěF|й=U QF\@z!% Y 4Lf r;Vru5v EE*i+67uLz`Tʊ ]Z6J<yo ʶ9լ0q 8H@&`U#j DA80:3lq!pqWp+xa[V^9rjN+7-fG~d{?ZquYy2Uy@dtFajGaʡƩm4fȇ}!x=]uP\ۘ{'_(2Zk#pj@W7`1o'R}3QctZW1ug/N-F_Nr!W3bM1ЈRRA /9j]NME 2W͇^> F1 6&$>Hyyx9ܺ7˭Rmbݳz$]hb@o&Mя/\/N;Y)j Rl_ĝ+w'N}_&z(| F.XmA sgZ{۸_˶@bXIf60xȒ#qE{8Ŗd[vy!sxx'  JqTQ( lQtqhaDb"D4P5qnF9 M)e >+ ĚKE;vú™ G#&cSD2%X̕ݓ-PU@Rr+TZHɹ(:3gP#e6ri)QKbh(e(F4<m!Z@,U{O~ގx~iI~:y7'!b8("W.Q%ҋ@G"YD0Uk!69ik_cս5{J#ݲﭶ[!ەPCekM{hRuQ* *ȫ˴SR*Dos*T[?l* 贸UȢO⊑ay<@Sh2nl~ uv N[D\Kx٫|('dci15b;gN۵b^+k)K1A}0XxTQ͍Mt&aw"4ʷ1k[>ޘjt*@M 0)41aXԙ5<10c~ ? 
InT=}a}7>&9{W^]сK?;|]&_R΋A?beam3[aL Z (jUϜ8;g5p d i{@ZXI.{fhus=L6&kflޖX 9b \apH & ʪs#KuRqXt'me,!]Znu O }AeP󀫝&2$Up;oB!ń3`NLj9+h-4ZGYc/I{<+=~r,3/o mڭg!'M,l$LH3Sǀ/5E^y.lM HPc<4o%qHx#^#ʨSl)B nC$]b Z]?\zF~ܬUUX/7]^:ܩ+E0Jҕ&ER?>?.䒻vKxjz?xB'ܜmeoV6Ѻ~ZӇn,U+à_Q^sw4m:c~-εY~Cuh3\<47.| g3?!dî,lh3mΠT"B7wKl|-ŕlyVɍRJ*>zX;@`Ψ$.F3њ[gu +b(uxWUVmWW %Y]}J N?ITu0Jqn4,f%^c==]zd3%9nT_,yl>FVrf-+#G ap@{sX|œ'F}ǥSO w}3AMR:-+fv4H#Az9bB1Ģӌn%PeH puOrͶ9IsoI'5-3Nuzlﷱœ싾@?{iF.߻'Wr:{j: hqf̡kiē`Bg (;{}_]Z͜w}UyI0jgÏst-揽9{-TUi?1,1â޺M7ڦVgB\6[g|]%|U݇A3`f̌Q03 fF(̌Q03 ~, =X< Jqy|$7ɌQ03 fF(!fkb„Ȍfhf̮(3`f̌Q03 fF-] UBfFlfF(3`f̌Q03 fF(& RLѕ"2EW]+Stei11,E Fbg%qLً3)lro'{堡ifFbf̌Q03 fF(e*<3 氥|&ɌQ03 fF(3`f̌Q03 n6O ڏϫ*ܯdlΔzt~yFkƗXrU2LBi+>HFĘ@ QRI\D#Dscቴiej7ahreqc*h%{gqoR€ќ-魉Ʃ^~~3xjꓭ8Y r{BӳO>L0Ӥn xuCxUc4^m&Zfg|l%0o IhUDVRϼ,2:r8 =_;xJ|Ae%Nx[D(W*E11s)*F:Rb&y$tViǬQ{)A@8DX/5^#ZFۭd?9f$IǕVo/n|q+\E7K^\IkcN(!x) 'iiqVt k"FEBNz~ÜHόu0ɃvE6S $7IEdW* FeT$fD00<̀+v>V*!DZ,nd% N'I*MĎ= :Vp<3RῙIhix,:=%EgH*PإYC᳅ ,1$^hlHNP $#SsܝWn(v~gqIǂsR{Z$K>9 Yq 1 hXqz(Mq]w4w*S+\OQ>Wol+0s 0EdHAxopf_#Pg^(#+:mqkr 鉸+tHWqE7s^y7>pYy>ŹIKгGͧJNkG|nʦEM`625i |08B1P̵~?܌r:btZ_x1 # sftz99-ӾqP|1~85IvC!&AΖUCWjf{>LV~Ygq/յ6\V뒶2aIs>?W,G?ǠP2&^R1No|pSW/^~~~7>~1&/߿~?FUKE'+*K&Jj/Lx"$?7_@vX1ܻc~][@=VO/~^jF۪kSnuQ$m6yI?^sgR! 0݀2>Zi#ͣ;x%8Uc,JJ*4)fGPm]ecE0?+à z]EJQ&/umHm#!-N'P %HGBd Ȫl=wфe5FɶsSBa'#&f.aSP?0q h$ϛPlBmG{[cb9W=iǀ9j#V Jz{kƹ`1X9wc qXK%*R2|%gSՄLD4K6~6ؑC/;wR!Ն VTs>JmX. h'ô(RdKJY3wTtnYG CCJ{ZRI=W>&p[Q41Dk$!RD9DŽRn&x92J#"z;X+\$J!RHDcsڦnޚMZr#|&+n [wzSWaPVHh"φOSg4V:S]^KپiCQ)#% JPNVFqAs"'$Z#%9]R@5(:\֤XX(A=IL꧎ʃEVgs SRA8FF*] \DTHmm斀j)<uG epmhĘKה ĤUyxTQdtx r3Oi:U5U+I53#hEkmD!B&pkDp, jlG+W9ZG` dd`DF%X&dhH @'n#95G;ikoEik`x+NEZcm,|Tb7 _j *MAz/ϼQD:"!G[pMќ ` C2c-wt!JkY::5ؤ,r?>" o=)c̛ 8xWl(|?k#yNΘR"K \镐Ȝ3uAW=w{.:\Y9 l^L2UqTX4٘LJ樔DݸԹgrnPbD-Z)Wc*GT_@$S cݓvd(Qpx^zɈ ]F]d&}RvB>}Zg]a_*[,o^3~5_$?;{9i#E)Cp"s֑\jp`1*%gT!n!n R*UFG4$Dty-Y 3x0J{MZYH; qٹ )j FswVaf=nExlnp7Kx6@(0oD"9 *EI}hH.bΦDJHLrΪxd!vгz:yt<0R8uDSLS" D(HNv'. 2߿PN[ 89{mgO&:'PWMdҜrF%UF7Է&N7ybThCquD}RԶe -Kmx1QGAiI)GE̙JS@sGҎ't00UD(O.Q9Y*=A)"jԯDYM'gWa̰1ydXX<{V|`|n[?Y Cd%C~ґ3f(rT6|5wǜ w(~d7 g_|n8o/_Qz_Ͽ{' w ߠ~d4>Nml5/ʮ8cՇ{ctZMz!J_7CO0æ?*PdPPfZҘ JAB@ %(Av:*J>zEXr IZRgb)x%,B"DEgz_r?q;\ Cs0,+F5U'2@fC7E]n2+)5eH#+Siɒ$P_5#+>'Lqo79B)JBYJ*ɳŝzdcA.Z Gέ'=*Fq'oe\ՃM١9QdE00c. RQdMMyO~lњ$*<*XUXY%nu>QAf,cv6yfJSj+[.[Hkږ3?ge\<~n@ff40:r~b Rpzʆm[\zo/a&wmgK擪 L?A;ΓI5){$*Hz=&0"07YӏqrOFFq8Lqà4;̋‒佣/D e%" YZeN|Iy=V~!L:Ùx>#5-5E  S؅rqi̧'4]WC5=kQhA׿WO9[9~ÛㆤEO[+.ū?6"ŴQ&8. gZ{ٕuðl[+=^=1K>-28βCoQߢhNƃ8{NIuj/C/~[e7Z;Ͳ˕K6.岘k忖,/(WEj޹{Qram.p@*8yk367sMɱ+4\l?7=C!?熜ur<1MM?옄fev= |⍫{f5}o']03AMusȜaV׳h5ax3mz3f3+wvō+ .w݅X[DmllUZ^=ܤw]jJ@e 9k85*" )[)Ա V{PkCmK^%-x} Hp@MRJr#Fy#7 '8'Rιr ^fi){2=Vmex" Q]"I$")ƴHd>ӇGb *>-?|QĪɻRxJD77(}fad20/%۷  `PA@Pw @.DUWJ&fP^Pcp[* RY(O:D˝^嚔-+sx+!{-*bi)|iY&__nz6 MynP`iП9bKMryN3Z22VxHν4]VhvNnx8 s=,&W:(UBHhcU)@"A5rGl?5s_P5Vjw/ Q# -RmLp$ ZHnwYTVcTo$h!fHZсF@b bp>&I@krqk\Q7leD"vx3T8X0Gƚ6)CA &M$i75 6)MyԂ3>`dGHi*As89.y|:[mqѴF¼mBgaZ4cbF,(h Q B`]akܱ-ږn;k@jv5Al?`ڍrv\xOޏ)ޏMn4.ɯF@)hax3nk<7};O8}<3&g"Xs_P?^ ܒ\{Q+v Vrg0C(S#e.X]Y#gsW>y8j?Ѓ ]&M󎇛3g=ay2ZF9Ta"czPu:{Ky9 /jqq'i =AE9TTW`j.^ 2_Nˇrcq6//˓Q5ɗ_RK 6=rP*Skn=\e*;+-ly0pe2R;\e*5{+zPpe+胁+RLW\x"8vMd*^4i"lA~bD>[P6ƀח/kr1F17ˋkjUBV `nt6WiŭBRG[6L95!R4ʅmwGͦq|HI}9.Vdofj6\J-rF''9&^JI=J'I(%ZDk@%IIJ!ZId-"?{ƕd OءuqfwdfM669I THG܎${N꾧t Y3+$"FzyЃ{) w{pGK8F=|ԷO[dY=ZǬJkw z$d6JÛTw̹c0KBȅ^T:ރ%vwfg+Z'A裃{pǨ "ou;?;M{"Tj\i/Nzu V5lbqM x}1>5qsIQU-wTM%9ѐJJ-Atr=ʜ%Hn]طJ0%(mZdjڤТ-)$8:BOk:aBziڈ* P )HU~i! 
C6WKr 5jFEಬT uWh%Wk e :om0SA$ Ex&X@E jg|Wuk@AGEGܤaB?%)H ̚` U7P r ֑tҁGJ  Lpf+%Ra7j,7ݶn4Vkx5{RRR}Ju yիf@!1&伖H RD A Ք&dY\d`niXEjE},B\Ӥq":38onPtrng(f }n"磊1S NR&D_ {aV9 L|0߿m/ؽl_ ^]-zDt$MHc[oP\ፍ`Ӧ d)h+JUe ,c*Jr7z_fRcQBgsA9J$rAV 2*2^i C.k/A_EYьmC5YK1Z1pߩqcU1vnWuܿ ZD>"CO,[ 4"(Qw`ҧ5HA/&sP6DD7F|̡cLFuu ((]LEQ{pYf}Jh v% Aܱ- %HWPPl50zi NmB[viX<ԆMa:J)Y Ǯ6 3)Hd(nfWbe@AJQCEdU5\Jb,*&a!dE(3A6 ?Ftbl('JgVXftXIwH'Y g#NGh-R ,"D[vPjj5]ZzVE*>j,e4 %lئQ}%e SaCj)h\7cs|i9ZYwi7ղ]b$YM=4 Dz qu*ipDnc+q0`mh Di Úg)DzI(Γ˃akBm2&Q= #TO*Vj7% xKd]ж+9^PnDDh6?(4(Jj 2XA2 5@J'2(XoڬGŰ%4v*'"($'Mm2X4kn3`'|WLF-LB)* j#XQTbw ;ցNi,+~(]ڈJ5(Z58)pcօ 3 E5i+M3d8N֞ +JOQA,iJ C\1Fnv5E!⸱nRM5墁Uvid" J.ȎYx*Pd`NЅKO0p$D LBr6\z!(L rt o 0P5wKh!W˾ZW&n"KznJszF6G̀w7Yc9T^3,wOP1{k zJz{n Zob]_kqГŦJrZ_n|d9X8 iMq|"~x|DP=";\O\pE ʨYp9 Gw\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł\)czh{dөWCqWJ4,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\WTpeԏ W0?"PpkW`Wʚ Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b\WG> j `{Xp {w0W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\W,b Xpł+\A /Y^Ξ|tՃjvpݥf=6ޠrzL0sGsx40Xa}=lVar0ң+kz,pE~I?\1\aJsS^w0džp0ku8p0+'S WueGWdp`nY+ݧWd5 W!\)z:rNW?Ҵ+ bYZo-q0 x۴>‡Gg9z_] e 7C;Iڻ[+(9cJh\G)N=.֫WQo l f={_ٓU>jE+Ը \fE/ڄјf7'RMZ8|p }BnP!/l,4SOֽœoq.twœ#M?-_͞׫ޏwƯUﰔnH'Òn18̿xd8]?isp@GD hyAײoVuVX籯8`:]ջݰ -o󼼽7Un݅>$n}1+1yHߣ_fK.xubrըN2[NR`b:>: QiETM9nL:D, Q.Z:o: .]spG]o ^FQC!)zvV*vd"D}7Wg-Z]}ҿ ]l-:x>m } \Ew{xvo7(x(9-+=6߸z#^-iOB-!>ڴ9 P䀤\J܈/{比oe-<^P$U.Hg) ۚ{A(!SpE7F;710|.`S|mUvCJǒ7aB ~`&I|Iad \3#|4l%L;Tw}"lNpE-ߎ\-Em?8M`h%KVjPIJ{l36)E3 *C(0'upEhjKUFVj-a*U%u$!&>Ҍ< TN܁>sLԝ9[ŋ ^b.TtppjAVJ'e Q?Y,ᠿ9Xߥ#1{_1q 2tQ"ESWv|ؑFNėq N ͪnpEs e[Y,yyj_ l᣾[Fz8w*Փ/iZ7}[EatbYFp'jt5eu9.N_2@dtǓfpk=EZi:Y6,ZJbyL,PLǧUY,ٲ-~|W$سղjI+B |UVto oL:va3]cOOS)>~K =t~y^#n۔fŅ,_u;'1+!f':{l)2 \ت;2y V'[R„ٗk3=_;#[lu/piO ^H7ۡ'(ceP7.z+(mߨ,mW-B}vv׳~x_|p5/<5ߜLOy7{W/_m.r3x| *r~翾!}ٻ6r$W~貁툎ٝ8{_fvp$d%QMR-;7QKQD`b,2}@ü v1_?.Ubz_E'~7S.qz]o8O?}&?EJJW/lxz}gI/Oh4orq)t>.\2RHYt2 #IrYׯ_= #Bm4 9!G|ZGUmTBU*q/Hs/;h tg<:ou^VBhsXN?&vEBեJ4kӔ^TLh~:3$A^iwjv>6{YE"0Ԅy 4k}z/շtFmn;I7MSF̈́ .q\=S,}Q+3Sk\_rtFf颈%ɂ$CBdЂP s #!oGbJF؈)9ƤnJ(@D1 s },yZMȆ!YG$%eI5gEalhMnU6I˚/?e_weBKDv^QYBN1bGŻ*Pdg`EDAz#&`13uVܣwRM{={{);BFO8: BKko N&T*y0 @;i"#8Y1B.el%#p@8;YW5Go%.*C/s`3Yr6`$2H.Y&$ه.A|5U2gz;?AI%$ > Y먒O, 6;eX=ԪI=OZ(qIyL2Rk5,ĕ0&DJ@;h$.# GqTM͉ fdž#4 UdYYQwFdE 4Ц^툐lclJ&-zGTm| 5t AiLyc,$i0zh- t_Q=`쨢͐}*k|F4mEDYDiDC;$&l`+PAldb4WH+27[HTCoJZdOu>yr9hK9rW.oϼ}:kzRo۝y1,$ӵJچ[>߷˃+ۘ /j~uЅ|wIg| à[.Ǿ< P h6GbB$.3os6|~%эw3feX8,b~9y]]N2 ֢^$O6* I.PXŲ`ɜ499}5:V uz9)df(7T>\u/[jjaUtn x6:o3 L5*)))qFPS苙g&UOU7J4ZNH OY[n{C>.QfUy21HW*8^*L-⸷s1 z3楰Xs",<.`s#+@k*rbx |0L &Z.ՏO#A'Suo=y& d)l} 'Rٶ2drpT+wgܐkR\&(w@3yϥ]N`u٣)Q.yKpT!F T&>Q!ZzL9"V޶g9?5jQ[%|X)F&+daeB ω1t(RUzjs姷zkwK֡t=Uuҕ'R 5_H/*p+eVQCÜ ĭk5owsjիV+4Fe#505r۪"E "E!$dYRǹ,9ɖ5A,6f?00k;DUg;R$ ܰЯįP I&M 3xӚ+;.=?Mn.!3ALe9ICrNC8"Hiz'sEDZ|k/zRjx{]s4tֺݲYrZ`/O8Xl8M8( \FOx>R,^ъz;6 rk黷u{7s|tuf۬uKaf_NOǕJ/@b~@mI𡸻H뙐a.),i6bѬwcU2G\s>}U7endd *&%G] uݝX/py<KRb*&֖iUG\ IsJftkUjlS`3Aפ'R.t/ϷszsٰBV=Գ{xW4Y94y<-J6L4pmf{g'i)x A T,"h$}wJ.\3[` ʈnR!8.%r,%g2'D} TХs3j^;by pݮX<T~|&& ƍ;tH?"dgMvk9ϲU8[ft=[wzt1m\[.Ϋ|*F&:dͅ/@hsץvĠ1][FZ7Ժ}ҺZo|E~Gp9wixlH=nhC wĐv}uKdz@[ɔG WzC5Ԝ?U7$,<_;Z1.9|ŴƟq#Oէ ۋ(/ X+YhаL?,?{ƍu`|@Ga&6@M l|F*KFr{9F-ْ-[38c͌C\t3%b\s{:=䯴kTRϕuNSj"hFtT;ya/=d*MMm,ҝr.kyb/ b?=P,`SX~#n\E87c`LF.20͔f! 
26'4*kUWY R FJ r,`$jQt  gbr:!W7wta|ŀ"Aq48gq2 G89Ch=ịUD1Zl* ;ۨQ1 DC05ЩCf=xopkuc0 k}{_Uïh<yD* *ǣKdzǜrଽm$\.%6!`BLB1jvROOR5lj{op\)7qt]Ckqd61ݫ3본5tN74JfoVԙ1 },%"9- 1 ZHbQ@@Kʠ7u>p0Z@ҩ1LxTFJb,G!SH' +nXTR[!{НuH1$Q,19O 7;x8Vp<3R4ϯHǢ#SBXt1r4]5>[0`q0 )REcߎC{kÕ[R+ngR|qIkct)t{~:B$\SL3"¸5\b`7q6f-=:C>4.|{`և)b)B7 Ec36kc@B#LiQwmHz:ݛ eE-6`M@ZX.Ra=ܴ3믡B HVc񱬝2~\ԝ[CSɬffLtv]ʰ,Dpv[=Y_W4 n#SeF Bb! ackL[4#|,~b2.\gkHB`.͗n?:ӺzQKE`j&/+}OksdcNrS6ds6nfXAG?(8 j0bEi1Wu^Mn(+7\}$e|\JzV?mLҪxJoj$A- &2<(nWŦo"򠟕AF=.DJQ&n`0' 9v<"b "Dp 0P‰t(p$TJrI+j3aVt?8CRTB ∉A yŁ=,!9@0"~yy_6Q^}./jfa[ ! Ƙcx4=^r@PQ3-9˜Rt&偯#RU#>7E5"*f8j.'sf!6 :ҌQ\cW`%js*Մϴ} %GͱGxJEɸۛ(/L?ib 5l [D(HcJA4?Sl~¼UI';njf0EJU.Ăwpf3.0g+ Iͫ>d݇2MK^-qcm4S  LdFidFrhNPMs0;omjXfyF^QeR+#Jmƌ6>fimw1QFeaT}Skb'6.vBe]l@pbL׈$Lp9 ]%tPjҕhiFt%ֵUB+P*ȖMO =3`LWtZaj7btEv+C#͟;=F6tqz.thU+@v96폮+ae`OtjYuZ򆮎(UEm BW -<]%6tutŰ&N gxa_tpšWwB+9:]JPCWGHW&Y s\uUJh:]%:JSR'uUB[}uPf(Jbn݁hCWmz̰FtkU UBKX*#+Ft`WWJڨc+$iػQW uV7tutӿWGuUB+ &QWHWX K];{7TUJ:B3Q'u9`q]*%tPRQҕVK2v+ Sa4u-[#--U=2MQVF&GԜ51jkV^>2iG٩e| z˥D`/\2ɢ #sF By~4i",EN>[j;AR쥊> :c`Btm&\Z=x -rih31p< 萐U\=x Z%mCWGHWZPEC,Qm*UY%Jzm;f[wKi7@Lj{tzhc@5+!e7ԅZNW % ]!]I]% _ > Fc+Llr|*wݼ|khFy2KY>+t:鍳qͰU]~L'^DfTXT`#)EN~8)mCw9bOt$o_ߞzlحUU <W|0NT.%65yeCp|( D+G.JN (j+-ֱ0=fzmqFVLՆoOZ穏Aw9iMi`S#~% 7`ŮЛq=l>u4bLϟg:b$tי>? ddgŕIү$xy4>Òa2Kʹdziʋu/\<8-zЯS |AlՕ6Os7}H"M79zL'Y>-P9&KFq?ZTn -7]Y$v}Yc7Xcih%CZaޖ6)Q7c*DI%:p."^b >w!<"T#,qDS&\wwǝ֤_ 3[9kz'qA OlbZeec5K=Hm2 #AzБ[J'z8$R!(ќGAcN9p^6ѴZJlB Oc*5 d/<kٞܯW2םEڄ2Z/uv滣, q>tSe%E3L,ѻcN(!xXN486qK$F 9=}$ ddp!sI놚ࡑF  DJNcRJJ37 zD0<K큕k, *)ӥuUНuH1$Q,19O 7;x8Vp|z׋ E1Øz H]j:~;R7c{8Wi,PLKuZċ $P+#dؔd{0}_uAJؒ(آUfwi|.𩇔0)i^MFfFFK>C??~[\v9hzؘZfeb^ԁ` I]8ɍ21T#n\fwjS]{/77-y}z2_U2Ԡ2SQB9a#{î*8ϻ.m97m'pg`=uȔh)˧x\PO7~<8:̻&}ZWסV2;ՌdJfM9:>TRK㨽I k!g/{4jcsMhc e1.TS+4m]4ͳu;o& o.%ь/&s>;:n֖^2V;ˌ|p ۇ+AȪ[9G:]5 [=LsTa.oHÚG U̒|8-twڛpVG=QW~Ϲ|<<+a&9>w$ xIh%ΥȍTDd)bw.Z}$$U.*- KC$Ec#dEDF[}MphGMQ (CۈPEL9:ׇ^-i{=㢄Og(z#B(Csq32(FUUQp4wE5'ūfS 8SaQF͟Sr~w;~ڮc'dg9]O;{%7-T_MuL{MS7:v8mٝb#ҋq՝(_k,wD.,ÂD2lx [^q;9wOoxѠNl= a8)ح{@_4Ɠ!ew!.m!`8(_Դ_YaԹ1;|fe@r\fC|IJ{PԀ c:CZϥ.Skn٩˼IQ1;97o G_&$vq);rh.r.nkR5E&tpMH z.+XT#TVTgz^\rް7CNΕ4>-йaL稃 x8"63UF60v \PzH+CwFp!PISP.(DN 쌅Eߔd&:pkpߣ2%hւ7#pW2nF&^%"")~6^]mDU@Q ?7L[+IxAPS A $#[قO1gyoָVӘc"Qepꑄ!𷍆+h*pEQi񤣱-14aG[1i_m=b#k\cB ~9 A[]%. >ݒ*}u&{ w,O "k ;Ǎ4ᣱᧀ_TDɨh:{k+x(\IڿIR奱J3Rm=Vj-1 R5n9%)ʵ8U(2$EM= yTJpB&S#ģ\0(9DPv.۲Bt {J5>xEEco4O'q-zVJ!90i3yb4A! eP.$(H!$\I# @ʦ TbwVTYCV Xq6y>LG00wHT)vX[ӹ@pVE@=[G%jNQn+:CrL-k(}cZDdץ2X,(j ,߶ZDnO-"Z!"e${D6'E DP69yb)n2^:|*( 3jr9$yesXPRx 7 4 [1m ̣|ie-76'|-*wW(QK Jg$5>3P#gJ̀F|U,I#I@ꄻ ՙ0(3Hs0VPGk;Z-AϪ,X;&eݤ6y G^@"ǟ`' 6 /*J)&mI"S Ad^ )rTNYŰI|sعrQEL`!ڃ2Rq,HlLLJ樔D@ӹ~ rK86k=rZ+ݩ/N%);tDi3 ]y){K/$)8/QRQ4~+c:pHkId*ڤx(,FF X9N7sԀ`UpDH\D"RF\LI\"B<b,9A]h0Okӹ!OErjDA yxOT|FTjU%5xBȦ8$#P (❢L9jKn Kbh9Km<~#uw{ya2;:"ҿZg 6$PA{\6) }W[-Y#d&g8aK?T2 4חPw$ED<C˱1ʇ +%zjMPY*#ЦA^]gf_mK>bLӗK+q3mbd2(1eN: @ف ƨQC#佼mI/H1T,iIdZ8Bf8x0%*Z1 |X;+q9 |8^#hx[[;Lq+fwc|hF$ 2\GzI],cٔ(A iPIY1vp<9Yz:8Ky0R8uDSLS" JQn-$ /uv8O.?v`g8s<{6x M<\pCyw^!;Dh"3*4$騾5෈kb -V ]*PԶfwOI`♀Wp-2eJ*J"('R1W*MI =.qqHƿ'gt5s>=xwwbԍOM`a/Vy΍'ENz.A %w֟Ȅ4ʆc燆SHᨎ<&ٻ8v78MFLgښH_{hS"qeb= KIx fuer Kꬪ/33V9EUtx Gl@sKgܸbxPuR qo¸%>Э' .!b4#h[P", >^I)>qxrz7-0&{;wPNB ?g韓G]tΡ<V|R`W=axo?WspoJǷO_׋˯Vl.wODųQuLߚvW~uL{kn> Mƣ̌߿q!0ГIGbů1< 8adͫ_6MކA76x|osr=6@ݐqtT5.N{zW{(jN:mf_sIs/ Wfx,=ՋRi3S\ RiEA>S%@լ+1X]|%',1ɛRkh׉0Z~\:!FY ~(=ڜcr^ث-):9Ts@?AMy 812M;p ^B./6/F+f{#m`jI-P*%Y0AV{zΠ_Ox'=&gTU]$kΗLƼwgf!co;th6qy\s`/,E 1K~XiE T"-@\#8cAREkB9"86v(Vhb]{<]3:j&`TZOSO "r\ .r|r)y@/qGNՎ&]g*y$vc{QJI\;Q=W Iæ[ЃqoT>2$mkV>ZS2+EiiP%b /*&mϤmv-i]9wCZ'1SLTi[[ p<.:! 
reyVW B*N S띱Vc &hnE:+5SB[:ړ',aK#(Jɕ:8i1FB' <BZp5e}?p~H@.(qCvYI(J1 ᭔HUsGêl3N#OK퍇"(|]_X'՗;͕ت*z԰&:'*PA8LF Vd:8  `8K`DGd YR>R.}Ԗx0TA3r#c6rV#c>[%fMPdBb#i1d/oV_W 4:I_7(M??9bsq9EÍ$RJ%',0LdtDkfa{)b~)xk*:T`gr8A܇Cf#g5VN}eqg1l23"[D\DYv|T#De=gM*nc*agH%zV!!Ɓ3uS/ gN% D'MFbu,3"f#gRҟ&~|:qɦ2jqťf5ނOh) I i$ Rm`Q ERf6pqW0wl:3p ")Nt AuEF\xGяrEG? quq\՚tL}HdhEKh0ť^!I8r{.GnH#"+Ҳ8\5ULoS c7 ca}ԂeΨU$ז4x.Y:9 3aStʩ߲xE@Ukտj'*5y| ӌޜFc[8oE0Cl?;}~BOvbgnyroG%I c2(PRX8 KI*5x[tjă ɑp!)ELpoN^z)ڵs† ]S/UO?֝w&w^W[ j<3>F5vW/OUm"E|n,Ӳa1Qu}WL"}(U{*C~%G:,WNp'H)J.54/ DcMgӝ;ØN:FO #kSkA(  LDbw72sL(QmGs+:Yi9RrN5"V_"D4Վm} w-ʦ%E,GeR?*oh$u1>ZKEee6f_VV{}{[}t^hTPBb\:Q3LUEf lBmNA)<'{T,_jonzr}^6swڻ.9&☼EKn4-7q'4߸ubG{#>bsZʄ2a^!X*1j ԰Q>G=g;Sy sG. D,?6G=bX?\n˯U9r0}dF˴ew1z3swY6Z/1MՕkm:Da^iGjd~rVk/;^sXeol,ʭ2uѮ.O(i+٧+.3>T91Eqi ٫^wϽYoAMm>̄? vk=Lw5]jdk]WuGYZ ~74V֤6r4)Q*nq](G} ,5/fdrHx*C ~*:cgH`k^*8ʝ7\ DŽ3@cĜE NVFZylY̔{)c~ywi5wG%[>X+Æ#H.fd"f6\Fhz HPcV[B^1w%)\Ӽ'N`ߣճWy[3mN슮'Zxý r)iƉr`!g5:2ோ3%ʖкSn{hE6-ͪæӄ[v^-+]yJR~Kk.w.Ƽϯu%tI@E™Q_HϚ;_E486NaW[ߑol*%5ݛ l@Bt_*%j]/a6!)-=aڿ9﹗ٳbt8XtUTѢUYD`a+r l]G؊J"=DX8R|JJBdo*KU]D@-\CRJa`BM䲽Dy*Q) Ӂ`bo PAkPNPl7;oey:#S/ŧ~\:s薋s0Of(zqwRR^CցS2X00ô uiU6TVxQf_Z 2KfR"J8QX#KrEcJVZ=y_=~VaR=gnEUFzhy%KxoAl1U>t] o7+Bp\^lEjdI$E$lIȲ5AdkC>z/9i kGMzѧe~Y=W4P)8 ~B$SYhun ~RҪjkAޛaG_*Ȼ!QTN 0GϽ0n:0d۹!$S7` dRӛaM?c6?dQ-bM+m/T>Q3t.U庫+jPˊԗhv8J ob.&cb#ZV@iIMlnXBF'CW&3h5m+Dɺ!HWFsRod d w%] ]Y-M 6=l}[fpY2theW!Jͷ~=tevzsF4UǧfͱW5kqZCDWPf3 ttoSC% VT&CWWT kF(5N&=rvBttutō&DW9:]!\AS+D+[OWRuJM6B:cPXBzmuЕLA|f*Bt(] ])|/%c+ ]!\1hiQR ҕ 5,d: 4"XD4]$]Y¤ V4c2 ]!ZuWRvUo\lXc# <ҦPm][(AXBte:tpU2thu Q #G7=JvBvtut)1L'DWRr<uhi;]J&::E %+lm2tp%1et(JRtJ ˄ *S+@MA@i4ժWTS ]  jY`,JWF%W\i" s9QLkez-?AVvҨ:eMtl}cf:8Q-7j93h E팋J9rC( ֳSqp8歲Dn/i{wށ-4sÿ/+P~] ~V1`9~?W`mz4 @,C}LQ(WisvٻVZEkK$W)a|x-kZ׺-,P?Yt~K4f!~:qI ^cR]D;Ǎc:ZJ]T#,JrkEa,Kc3Rl!&Tzޒp=rI#o4쀖{rlrMms/AϹ+7M\3\r#pd lw3cE̢gݪvl-<b1ךEn9%ϣWЗ9U9AJo.jj3rx&ɝƓї~ #x4q?ef3]WB2އ].05}=?4.,}CC^IGxD )wy](t 4( =[*Ϊh)U:*Vzg-Hxf( A*  dK9# ٙ V9YnbfڂTS*EX* ֳOb1poe%Fw 2]f(!- "J@(/V;|PPH"/ + :HHBSPw҅}ks<*2%/Іiī 0Vn4zDt{eVkSJ1LFA:+;vgݥkb\Q- 8o %tQX%<T(xЪkiXڟkhڥ#(rO6HB"QT:Ŭ_@#]$$E1uvMoCԅg!pDs!,[Б\Q|AE C5A^_Ӎ̢7ʗK >wFA\-L`͡K="Yot\"N"0I? 1w#0;}0޻}N*-὾Oɵ8{wItNя%Ψc}{QB9agcrVgn:Wl::-~Eв osPM&.9 ] ;ߛWl,4|v>kox.x٪诏tZëٲ@g0YeWjDm42F j!wdؕ8-,yDN: 6{M3}Ymb1hfT^yAr.%݅e,"|8a\YN2# xLk}՝ > $gAY^ICz:k @Ӟev+^uv+,d¿tյhpF逎(tNafOV.3PE||Le_)QxM;a~oǃPg_}럠g KEC%ʕ{tp.ҒG-Z.: I]:Ac[T oY"j(AlYY n}UMdG/,]ު5}4?aDR?~O?C/lpDfߚy'5Eܛ ˠ>y![c];#jsTlըO \&(џw; \HNYh*4!`L6"*Q!&YX/ Yzkp䞌)sp1wHaAgs2kIn~lcȼ26DN`,3㜴6)FF#Kd`Y%n|pš}rkոVũ\[$#<%"R!ɘ~dt4G(iEf:C:6Eюxں p!l :[[0/=Ҙjyυ7M*F [wF﫠?ʜ }egNTDt(Y͍d^mz~Ñz[@2:݂D-3h2m m킡yЭn'DϷ. 
3ˆl*}.["x"s@:K RP> 1T08%׹8٤"Ykc  *TQh2n+]I(6+LG*Z2t)AI>C},/Wi1aWrKs֞fMjtݦ|+mh[&-S`{<3۵Lw'SV}b!+F dR@ʡB'I |D-tP6İ/w*-U;di|< o6bԁ x̑l8KXXjzzґd3Ɯ ZpAI&5i@vU/G =DQRB kɘ *cZwWT7),ʿ_-Q廖J\xXLܪ|xH%X x EFe91XwL;]591U˦Lc)8cRv,+3Tmd֝둱W$c[,ԕPXxXx֣Džb8هY\V]7& 3b+D,7` ((99 "8٥dFkt +P=@@MM2hI0APɶce9bfګs=b(]01wIǮQgWJ C F!4g.Ʋ+Y9RĴFaXR5 &[/<+m|%Y $+: 45 YԀ`Klf>dTWxetPJэ;c[D4#lmJbV&mH5c#G;䐙' $(Kr9l-gemHFB.8RVF͍@2"ɒi7\FnjݹO" u \K%_g5)me\=.x#ل:z猎Q-sd#H Ƥl]a5]e3G?RY&mv*eɜnQ6:>]K $ݪ/",V_,]1@`' " ˞ @ b96(ρl=8&ečbI$%ǘ/s@t9%ڔ QjݹAa.eIU2R_hy[$Gy4*7,qI&icP)(Έ,h`bւm국GV褥ޙ=JDd[㕕oPӠFs4UJ,5S dFn838ݻ+۩$d[ z<+k8tKҮbr@H8U@Bl@&fU,h 2dUh^#c0^uVJZ |T[I8+ᨺGt@Fd(TVd.e(+ uY"4;ptcW`\,๾w˵v^^;9ؑ,5|]gۓ L9B/ZZ|9CGTQOA' ؜R0J( dɐ0'-JU72 S3K<+uq!^fd/W0q:>'AF' @qwRܐmOFm8hpa'7oN3wۂIFs:``H{ S#Xd4Ҍhr1a EUwޞA6Ѱ!-xr{0/d᷽}@B I{{ad%!*3Erd 6zmp!,B_WتwZ_VE=Ws^nzQ]G꛶ցI !Lk^e)1bYĞ'Z"P) !(< 3 &rL yJ9$R.cJө~cdp?0rN:=#KyK.ӵRG9ngJMy3LyEHT_q4W)qĮx/?e-$ZjV %ԅW`x|{<-@Iߌߗ W'pի=~_%(S%40% yRi> ; aPa8?pӝ_`˿_7^僛 Op?ޛwz k<#8߿,k =.N.yjf6yd_E6Ǿv%5>r9+F ;]=J+CZX$z)|ԥdJ tZdo^{|ሎvw^7`7w^|wM&K(\eu^}[Hc(91׎{y=ժN]KN]8z#,š]$|Dk28WG\988m .gXuϻMqZuz_b8xWh&j8^zsō6Ʉ ̛9jB̑[;:qg_馞 ,Yeq7Q2vFeKBbHdqtbSns@ٚ%,ny5{ir^%WҔM0I*V6پ ,QՎM7rrgąW}a/i|vګ-`I6Wvk%IpET̡V,[{`vmk_nJ;}~[øm[_y{ekj*6g X^ܧyi5U?=O0DܘRӊpf r)VKظnz|'k'bf 0+0{evKa>Qg0D U~=h31魱 G/TMYb&{cx<?߰e|J~(YYLstj)^CTZIlSAprS` \ ]u1μKH.q=5d*s5&̾Imǀ@h\ܐ=xٕ>Q7z.=n1jeC1yqNVd9yS#rbMqf3UĐ6L%Cml՘`ƈdD0hŗPsW@fћ1̍U#gɤBU)#t#V895pdk,a,c›Ǹ7ǽ"QZF{)AK32(=7P=<*5M)Fl X.=ؿPMA[ΰ\QpEzf@9.6^D} bML9 V^ԜFviƑ cJǐ;bf(gkSV֦*JOiT2`Vwq~iqܶYאnڙskfCљ=?)"=s$Q`xTJqղ7ݖfؽNz1HakZ|>N<6JtE7oM %B[aHWk.ZN!B&fV^"vabV!."N_ؐzu]z|kcaweT-ҨLEe$G]'e[`"I1GEmY"{O٧1jydR))-  +&Hq;?V\Qsh{2gG{jZy.B1ɺ`W^ .bQ3P۵@>j9^ 3m- @Kr!Nhoאj%KRgczMsF㍂,TC+DVuwWX%:cSSmJe4󈎸z:d׫PS5HLJ_5Xu53*7nj\[uFfW]͢u +;lV똁eㅝumU}|' 풌=yOy,mwgX >+Nz_ayN7s=auhoVPV䧽w_mnͧ;ˏׇVnYׇ$;#s/8NW;Ɲ7Ozj)/?ͧ SpzeUmh8g{.& \+q(S)Ng2-_;b&!oh3+[^bRcN淃xô8:?<0wFxS:oY’RA\ǺbV]_xpd9.w[d:SZj jhVUZ0kd>:qzNSYB2)-V9FvUcfJzlYB_"\7L3opK͘#lPT t:x3ۑ"cùks+i%d$ =DF 좳URՉG5rgU9y2VhRSc`7ykDh jz=$,L5PUC1D{pͨMtwt/?^*# nK/Baf /s9X1~'!R7"аBܝؚJj=xOs!Dm:G5]NG*=DnaR9a;F7Ei(5vI[%i##[ ҋISFPKoCs ;K= L c+ YӍ4R2s9 "#~ k]0,!ĎцA2ԆR3uYFOhB<%z'S) O^-PTl@tԀ-B;U`CNM#EaV5P Q( ]NTp ꆣ<%, q Ygqahy`kpWgkUBbs4`|@PWgѰ֞QZm-C5 RhO Bԓ-eJX\N1jS؎q߂K 7J͓3@2g³A&4: Dg:s gJ`P.WEou@ )F+W@X~lB0X EbG%Tb3x+wp 4N[p 4s,TX,j`&؎.+A @gdʹ y#XGdI_ *AvU h*sD#(#M VUn֑u&yGh)(`~G[mpPSAHedWC0:<3=+()p}j 05`&*Cq@)l,@e+:  °V[RZvm-HҤi |F _.[K{2zi#c]Dpb`\TqS9'D_ *;L8Y Yf'Ͻ,yvk 3"]Ѽ~`{ȋ&xReYr[q;"8CW VfESml7 m ct6Z/kAI tBӧ:}n,E|q4|\H{*ͰbM=sOxtI!ގ˟8^s@ L(Ql +Ĭ|홻5a*HB=Y;/v,R~Vnj޲pRypXBUPDFR8fc0-,$fjrd,%&@?xP*Z#Vpw#rmk‚yXf& uFгL7! cK:QbeW{-^n]!, #l$h bbIxS6pVɏWou:'oV HN7>wceQUfƋ!_[#Ah$O 0,r}]v5vse0T @ qt[NBf Fo6 R8ɔf1xj05i$K9OSFhn0F~Q%lMŒؓtp` -J`_S|'Ѝ mUsxVD+y;D*x@):L%fDigOHRaփK *!+ErQvr5$r!$-V0 ..A 8F+mHxlFzd7H,}Ę*"#_H(@"rWUWU_OBD z l@]Vt.#Pc6ijђd0'21\+\OjW %X;DI< `h&y4u!‰!peR?ig06eX1cY<>d 6UB=^t`%x`WWv, `ZH`/*:ׅH~SЃE+}|Aq,ݩ jiT;YlD.56W >3MJCDx!ܭ6]b]O q18-P%LwJFi*k@5?Lw[Fu5hsf//EUJj~#8?<' ~I顦eAZJшٙxc&qHL k0!{Z LҒJL d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&ei@ꐘ@rs8L .a.L TzL ׷ L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@zL +*0 ȂwA-9E&#dB)2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@ ĩRD(WXуaAu5!ʵfߙ@#12@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d!@B&2 L d="&ͭZTO~Vf{}\mRv}ztw&gL!CWG?h L]LNOje,\Q'9'a;Q0~:MH/Dg:WÌaj^O{r8_T7]]{ nŸ6mh4.] 
򢮘Q`gxGPi.;j:di*o{ݾ&M>;Q9+f?zh`~!9Ӗ38=TwdpFh^*+NTvCYM4H ͗YU¯ʼgB{IVQKv,8Z=XX.e0tyd BLř}n)azqʖ CX% S l%YF1ua+)R>@}+!KaP{o نjϋ4EOfjQɴSIGe% `S'UHR:cHk!Elj"QBJp͓WYXӖv{<qqxqtT4o}i >4lQJ7?1Ľ"&UEx>lP%F*" ֺqj{aVU-Gqj(CYQ\x/yh P@|md;z]ԿS~`w1$VR$PEUn6b%ǫOSt2[޽cswt;o>g-.ws[gWV(\fo,fX|0O^Iwg,߶"'Kg5\׊8Qk9p ˑXn"\;:W/UQPJ#ˏ!9 -uNמI[yPa7ޒu7 or.m\ɵU4ʵ+/]Bι5#fPBR7ϩ6;-NӬ*;_ߛMeg{o\ k] ڼ΃t/ .[2-kuv,nSI'Ig>[iX-)Ӭמݮb\^}nOG-;/tD鴨6B`ro&.y7'xZf`)x޼f#t;=y }saٕoxr;ԓ$h.ВxvS8?V؏cЏτǺujxS>|ŋ?Vg[+bYw 1JVSd9zVIo7@t4&%íYV2p%`vb . ZSp)c$r7|lb2_|ԖVђ(vbYu6 1L|o S}aqmI/0BϷ[WܐEY7>xXrtc|Yts!_\)%cJ4""ɠ Yȷ2ҢZU :[y܋m;=Kuj G}kb91P`h0@U҄}BƣVG3SMM $$:i$$ES̚a" (wT6<Wya}UF{HHD!`*9c ^K p$Wy|{J*iN6}WLe #3FK~*~ ~ԿL#.wG4_-׏m;+Ҩo`JW ,IEY9Qs g93,޴ L4 T)z3_nBQ@NF ]|/+ ? G`/Rߖݻ_Hr4 SIy$ZwʤU%l5ՍCzᴉgYR=jOiS'~b7ېcqCɷ.={v<<TRKQx6JEN_4ncKnv\qSH:AFs>)ԥ.9_pO_G݃lIe4\۾vַs"PH dUIKKryIa1|e1 bLQЃL>'=ٻ5ci/uUh7iü&cvp0UC_U#vLsjWd@'~n3".A7/7>N)e?_kuBR;UW5Bi5-a:7 x2omy5`fڻSӻ:qhEˈ ecwy,͋|WECymw7(W]6yE(W 1!0݀pCd_{98ZvŃ;J0"sa*I:$X)Z1mZT+i4OqX>cm ;kRB *JuOH.g #bS"|ʉ *%.fagF{JжBljC~]bۢиa]<)e(?oEYOC\pXߍ\=<|E1\͑/ƿNbO鲠s6O.]&OOO&҇Ir״'ϖ&6텻22}ta4|גnNҳ܎g2[@Q{ήExHr7 T0B=iΫh_wY]"|ʡ{@ȃ-:d?oe߷p4ں xUQ v=˳+hc5/\ck uN.&_u`8V&Kɧ ~QҞ&8_l銳5'k4u.e<⮬P1tq{7 %6$7SJ3䋻 hs+bݲ0UG\S0B7$S B,^o~!i^x_њ*%(Fɍ4hĝlAu}V\|p!ĝ2u +5z}t2| \d\Ie\2ygd'/lT#WyotAr`&Zw,O䱏n =8SVܕbrw 7#0&^UEbEhsKU/#HIm| 7L[++-KU2R;Xm1DEI(_SCH hF RSQTxclUjwm)=#.A[ #NG:>u +/&73wߏ?xǣ8^eC]ko#+~J|? Ý$0&LMF,Ȓ"vA[X~Hش-;=XR7EUwOHggVcTQ'T iR-ѽxV۴~U>@?49((FD2 L[;ues?gн}e rw>I_jz+L%:d^D(0(K\͂,BJ QzIJf}VȦt,s.fdwWs UڥTw܀\{I/ouC)΃Z3M9TL2'<qqY#_ E64MQir8A-w഑]ٮNFW'cWdCM3gE#kJ/91`,U$7X::5AmU(2JX, !dTd.]bdmRWfrs 6P+cx?]qۋmv7ݡ)cjJǤzGjTo 2GHuLH b404i Ȳha&Q5#/<982" z΍z'pX)U jG^v g=.2:O瞭~uU6Ui٦mXoWtY^.U,[(n9&{쬂 IRív !4.` ٙ &@j=J 9i˓"BS$aC`Őg<&sw.:M,Kdȉ |Hr d Z0LF8.)cjĕ빒hf1BkK-}{̝CVILW:!_>O06`)+M&z`(q.̍]K&$aK&s3W 2 "S *.eBqA!i t*u0l,!Jq@q!8}@ZըV :;1}[(D/eI%'Rt%M,b2!ER*S-Rb=XeLӘTYљL7V(T*9#xCVfC%+FҸC:m"BH86Bv"ɴL%Qm,Q-('K䉜MY~،#W4څ죏ևۤJƃsRL74@ҪΖ\t.I>+~#ђї耖kPꬋ!$y)8A pe]Kl|5%ˈ-)rP6# Y-&Pu7{d"Ig2:9vؑܲcGލGɟ Vq9B M-΄wuQGY$ ʺP$8+ B H,YIrJ[))%+9>YK*- 3"!k)JAq3UvrD8 Eӛ @Y.8!ݒmN-m4cs~dxopF[ i E);,2!OVD씷. 1hW3x_mrJne\bmg<}c`hoV6L}F[{X)gE2%=oBf,gQeJu]cˏqK(5>*/sM[ +BG^ f]CĈM@x<)ZiBcR!RJ.Qe}7ԏA?lfTvm2MK6#4u) ]\q(P_cML#̭%8p938%v0P޽"A 6"MxVYc#鬲L8I@Gg^rc* iÐB7#}gj\q.(tJL>mD4:,[bs75f4lY*[I%1dV{y!$9V(ȐtSxFQg-t2q"9"G! ĤBI(RNV%E3-YPwXy ]8(J=nap=1r<{lpbѲ@$0A@ e8E3Rp e%]մbaS5zZmG PmB})mRr?CJ|^49&~Zl^8{ead&5S\9*L99 B,H $3χ W{Nb1~_;4ߌ  #Nz?a 1w : nhr?Lo[=1:[pG e8+چ?A9ROѸX Oq8{?SzR14T^^{C&hHh q*mo?n;K'(l4St8ߴ4;/ ++}-)}Sa_'%y4nUگ'8G8_K/]NN׽*򀎋$n<%h\r?}{2)@旾5g9wyO_QҿR^(˸-Pxzp}I0[譸A-y<YV!~7;qp2-hyW-ܾ^/zj~#s9x͏ ~5yˏ\!*vMg/*kow x6_ǟ.T.x<;E?syQHqWow&x<(˯DH08ίn姓m\x6Х oPOs4+}O,}՗$;sYS+nyPnH KruN>'KJUSFc` t $q%#4HCL@4-~x.`ׅn0Zb/zYM`zf4&yg(_lC ] 9!qZI8oӜNLXeU57fJDEXN$ZͽS>/y+]5@,Jff Z}xSr7Րj;6i;]Xht>Lܴ8Cń]|ةгPҭuh5pgy٦VvW|FaF㜋%ڏ;(-91VzoDJP,Ti㮂xv;gtI1ec! L2IIf-Ƭ#H(utu@]ml)A秠J :]Œ,sOɷO"7ޠ`t6Og/e/YN4Xa jPqasޫA12x٥dFktJ'P=H6Ѩ`AE&ێ93UFj܎a4L+&殠vѱ+m:qA>Fd&+xOj1(Ț%dRUXiB&fYQ%IȢ(aN,Qj&xK/;]VFD!bFp=ĔJsiPt!12֬s%vYZ%W7{w1cƨ>eJ3!G+H,WQ*#b5q

jKS$vsxS4ݚS\e, T8++o}"ثv!?%G,$O08I|ɐwLNh u#DYNYQv΀$(σ#BkCD⡼I&.AhLAGLb|O4hd, D+4--%pRPJR;l>!z:Pa@uƲ cAwᎷ17펂,W0)NZƣYziUrDh߉E@yF,I$;UH:Gd|=.S@H%NN}]\W,t s0gP?l4ib}hBsU.qZym^ vsauڂչ5eys`%FylqnIsϘ' ȵ<̬Lb\WuS3Mm.¥F[Ϗ.SVXP7$ be"|H3"UyĶQoO? &^tbVt'k\o4 JŪ6ULlĺOO3ۖiP⧵d) ,Yi0*/ eB1Yar s#r ʞ2Υ :$Iu&AQu c6gN{34tb\)Tzr="Ah>WkU 瞻=RNj]8yO.&G8/p{˅*Fd萂haF-B5ܟiYeєVZZY iH5a;pέ-u@CJ\$$IR|lAcVż#,ES];zDY`~;՞EcMwd|Xx~K6EGᷚ]W'(Q7ږkqF.w.IX/AY[B5s &eJ,L nP3_$\\6Oʦʡ(L) IPc5~--M%r TGw28xqV^i{5Yf?ѠUIh?DݥrSd{gkX0q.@ 6ve?z&?--G"/s\ˇÐdQiiY:?xW!. YA"kZ_[ 6CЄ֏WHcx^ǓAYoV~yϯo6RϟC[bg[mFL%>TphN.חZZTUWO]PIEHr ,Jp$*㌩ `\'T ݢąEuvYZ\[{ĉ69=h*!. u+8>IEs&%sTJ S61>PJX Q8UFEL1Vz#7tѻRܜs!xם%,(C]s\`hu7OѧT\d4_2Y99vg\Fl;:Pv/Z}/A#V(t W)JXW+cu:GZK"W3J|}%,tH$jTȩR{S$1q:~$7 Ŵ|:xif?X3Jja2W@@-C%kl3"xo4& +䖛]ܓ 9/˹Ŀdn]?"Y8lLI A{6) }wV꩞*BO/Ǡ<5q8X -!w$EXy & 1ʇ ;$ k MCTn!BG!]6?3 ֎swVcf=nDxj࡝c1ࢤ>]5WZƸ)Q- IFEHQ)2ʭ%dnX Tae{[9< xjmRW!*5IsTQM+>mº[dw-VMCvꛜnTps! K%@<p.F500nv*:*+KrTDwQ.tCkJ~?{{7sG3AOw?Fva)#9?y'lHGV6՛ {!RH񤎳X>ٺ@;pd0ڛLL&QUvd/Seo?;"<7?!k>&u'~Qt<0[ ^Z,^p1hmy09'(%;CI_ygضCnG治św8$`ؾ^bŋCYEdHBVzY9#^R|0^N3ƟrΊxW_$G1ZN펰4FiBUAޙ㣄z/yۢ06ą"U6N`/&E,;_~Y~]?=ZUw?kDj{}7*gwlLWZf).t{o:OG_'Yi/(O?f5HzIp~E!{oіwmwi2GbD$} n!dk\&_rۺй/?95i/IݿN;i}(;7GeYNNNzXPH Њ#LJ|~A7&mĪ4)g-R|5 Vfrk/:roU\w7 ~er()muE3[lf+:1"\Mu8V_103t46,BC~!WFkGl[jCNN3Z22VxHν4"/E,'Ðɞpg6A BEBnGMLXVm #v1qGl?%-]L;Uڝ{ C{(CF* 8sAYfR#EgEN(|*I]FR =C*E& 3(gQxd$I5h a1qÚԷQKǺ #Cī4¹݇R)0>$k ާ)4kclk(!J 6)MyԂ3>`GI$2i&4w|87<98kǷi0.k#G_.6-M#C >ΣZ 4V#m@xfU_^㾸1ì> ykb4VґH ֦쌓khac_ܗuBva {$(jGO_;=(zg-F~TZH æR)AzKJ}>~7LߎcEf vם`f(cIad|r‹Cs)HB9.hz&*)a#brk!uSC))aEy8AJNȒڇՆ{sHIG$_N3|[z[)øJS'ixSQd= lBKDv//UG(ZZ3OGŻiͤ9C bfҘ,Gέƹ{W|vɽT9qwpP 49m{P1) ^<^Z!Ɖ;U-Ywh Kȥ d(YϪ gK>-#7jIxiСW91, IC4J AC4 hUhY& |VQoO0'ak,~REpT,M'Eb "p/IlK~ZՠVzҨ'wr `Y@,K*q-M YvRMf$>H;ǍƱWMNfCڳXMᪿ'AV̪dXtRXhЃY8tZ0It.VDvumɤE 71JHdW㕙/o `hhQ#39Z:mPa{c!Lh-'u&s>.+ U4s5 ٕ;De!UWkagjMV9ee.DDCY:$&/l0~-OhZ^8Z/@O٬?gJ'n &AN `Rm|r_?ʎc.Q4'LhBV!|2VPC uĤ3jI RA&ctQ KJ, }&sQx%;%f"s,@OՆe=u]y `7RP)e_N?L|4:4FoFt*xYϤNxg;2`ܫ#j4,_F5{%kuJ )]p+]g hcR}"Xht4-H5ogɌܷd\O2c"1 `VgPAD6;7Ƚ3iH7pWJk.>b\eYeC% VRAE~5\zG<K{;>/zLQ}6 Lk~^O'++%qc/1>\dV=xRAdIVwVW";djЭҎ2jhRJ#@Ɯ Y F5dw·wJ5,Yݔ`R( xTo~8="}2ki y-9o㯗c)Hp9tBAqi%r@drc m+5J;Y]uGtRQw}!ݕtSUq6K<ͦ4 SASZIi,ȗ+մ9abM~kZk0dV9$@LI)eO̵29HaqjC\ڇp,u}]>f5Sj]>`Sу6Ԯk {p4I>w]-|m20]O0G?SܗV>V}߇eW}/۔i8 \o]SsW$}+IeۭIV-$^gɲp#]P$.V ff-[D<(=&Ϲvw+sy}pfdFz`EdHj:Sb>*Q%T=,sR}!УC>J!Ũ6sAzOO' gKs\iH;|%bt kܗ?_vOwwRԼ8 GE)Ph.ʵOk )ndϭO[#~~Et>F{ yG1:|,e~KbX&# $Uަ!?<2q{B]u/kfOj ̞MwZp\%qrQtg+B=#"z"A3LL|Ӄ_W{9W!'O^^ܜNm ;wϊ: Tql_B{SxacҖ(DPBIjZ)ehSR涤9E )c}BZ-5}sGBr<ˀV @2%ٙ:pPhc o[*&_?#z!yb_o}YώGׂUB4[ !0VO1rb`;`K" p]Y6&b0HW)d^YqdWEfDR -<&rBtiu"9]fj)n Du{oLSmWp  GBnzhWLI:OB6Xw'MH c Mby״iOg/J(HLj5I 7|AoN?u]s9x?al>gx7xlnnkmBm9}xsƟфbx'}.o{;HO);> [ ВR qWdpR\*OSCov=G'u,ӊ⇦ׇ}/K\6HAF ಮ;lwvAq:j|.ř曛0?bGYN9='BeòC=:Jz W4Gv Jue; d\H5ciwawIj!N]\[eѠh+yYv0tCdv핣ej;ȃ(u> a2Bo[ ޼يKZ[KJ.hˑMJB &%~+ϟ?] ?tǤVKӝ\]TSȽ]9)G >{%[cE+9-|t7_dv@,[h(3.GP<`h<:wm'JF4'D!`<[/wRzB)=)S9IݘRI<-q,}b0xQ~BUӚe%:M2 .9Dp–.WV#eA`Y'n}((w}ͽ'*BentC)&Ș6 18Gm$Q2F#-|Lm:muad4ihU/`X2JˋI` dI|fUVmtNx +3~#HsZiTEFQ5:]\:'h@ u>_R=:=,ֵt_dO*(O^E0pN#D2T[#z }3.zѓ߬~,'z$AʤK`o]v2F2xs(`HRk.>JX:l&C6`ιE~5\k<~B S ێ(|( }3Lٻ6$W]#2`,+yY`]y8[Ng(bSWK=xFSwvkSͷ`l!29:h)jN-|YKIE6NQWtuJd' !>v)^݂& ]L˴4Y+MqPo_}Ѡ-,M. 
AZBx9 A:*0]"$v~ܓAn7CY}nQr2VCdYːT@ ݦYM9y1ch;0=ށ镝ΕϹ {<0,1#~ѱLj)Rfr^Ŝ# JT^dW?z}PDβry嚼Koz׎_G`MF8 z}?竁I4@C;m O:;J։R eiG* "ZsѩH^^^0,J4I +M*HrȢ gt& )S-d;GבR@fUVjNDA8pE>d *;-5 (.h#?ZkLa3FpNksmF'ǿw\[9Ҫ>is0gkO\\;O*|VԬϚiAo'WtJ 9ycag˟ ˪9$<1d⶝q]n5;x4A pV{+/iF(?x:G*.ՋkCRoO4#~~?y}|x|m)}!a8;wnuDQ;8|?{GO}O%2[:jF 7`3X踋Q횽ŽwG>hsv8_׹`[UVju_ɭZ=k+ְ4>?V͈??qY×Z/_6;?>OۛxSg{|{~z'{{o^>C40#K c z]ϴ~;㳣bvy~|x9wm5" [Ҳ?ˢyCQjkz{l@-~&=VhoٴtØ^(:5fg'?+ !F꯫z򧯺pNNPDlrv6C(},ʪt+MxK5މwDgeڹ=l{?NNTߪjgݠbF=YT#Hd5:[ga'[h7f&-VEw7(|Q;fyxU˸ja0Y^a:G'g`rj\sN*Rnkw(4T}jwAfoㅫ̆QLy0iMvhjIlb7_g/3\E>qѸ ^#{6NWǧ .h{.ٴ?xmo3 9yks1~vV>GD#JҦ@VVuK[GiR+X;4_&H>H<)-ZX-OgG?.mukY -TC"h*kK&#@&(Aڒ%RL$=kQBr*1k-bڊvo)V!IՉbckO>mCʖFitHYw3 jK!Uhw Cq&XE.IC\2 QRR(OvLisd*/|20N%c0$bЍ\kA`uIAiZCϏlVcrIJ[WR8*$eQ0,)5$LKc[M"kƪf`j ЍNYZIʔPsQXWm϶0f \B$.`:$%D+wa8Ф4IXPQɔ* FoDp 2:Z9TYٚ,0Jx< "ܠK]Nw ڇ=Ob]FmJ`ڱ(R4֧Dm϶ !er "UY:U%##( F[T" STBf"ܤuQ2147I@-r);6 VQb9kOeI*, v%d)1jD䒳"acV@jV YRBv.%Ɋa!_&| D2UCzC6"’K-OEEkp9+CۊGخU* BQRvwyECI\ D6| lNC;:j4hKҲ!-CH}'g V4 r[V4ZdRjJÇ Pʰ*RKEqAtkT9 ۦ&AS =!nAVor5E 0WKٴ\  l3hK^؎qYFa^5$d!f@57.9C d BX rLaǸhY;AM\"0e&kL!I0 yj%ݪ|7Ud&/6tdHLD#)# VSp4fI8+EzhgT/:U vJ )m:veRs.5udE\ll[QjB>Zu DѺvDIm"(Vc  eꊶ$DO;4r`W켵 F x΄`y-]i1.E͘U HN1ƈ)A; B LDNVJ PP6.>n:"K^J.G[-*``n6pLɮUcpE `$RDN+̫2 iI3 e0,,9Ѹq0ȋE%YZ{[x3 nEf6:aܨHp`*&ڥV`%g/L0)aaU0 vn☟G,q>ekJ6 \{#B|t)]L Ajx1* 7D. .B@1Bդ53mJ桔5Kp*|E0ZN:G!v,0 d!5(ɮ$!a6̨D$KF+]Mdy+ hAMGdaͱk7,,$fJ$)h ?A.j؃X0"Fp8.J^#,0" s,agDH! cȡhhͥZ 0+ 騚:a(X#i%rl[VjPJP-{)EFa-!Jw$d@0kAkkٲ&)\LKmDumgWrcfSv˴ bm$WhD0u fQ:ZhѣЙ٘0Ev"'4QG C1}r%RV . `LDyrh-GcT$QY.G6pwzBJ@R9$diOW\-FX7nkdشhlAT("($+!* < A"GooG. U {Pg{۸UnюËMn$(p[3֍,)Nzq%['H͈pxEl '!*ƈ "k!zZ\j:#cQmg! x$=sr-ݘ=H'bV"lNW= l$f|0ry ʤ ,NW+Lܻ8u[13cYqct%&0AhS9*&k 6LK@жS(fiYtEv`\X 6~AC3jフ Nm.X-1\hݥƖ&X6X=0rQ!M`xC>uŻ^bp -eZ\\:9/S ]%JtND4*Wj0=O*/ydmǁX܅J !jh|͛'%!k Iu0ZJ2߆ce[^/e~:LV?&RxH [*qI ˨H=AH @H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H=YHF#"Z< c!?3J+B)@:$BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $rI x<&10 Ęѐ@)F< IHqFI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BI $BIKުc";@0_KhH x$7r C @`71$@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@H! $@OZ/6^ٛAI5-OݦקyB !0&v@>Pes÷y$7e6!YI?yal֤tڇv?ɿw Wڙk\^\9ه\_"g->ןׇs<γg`8οTg>{=d^,m_rTnŚ>$5F*a Z$`( :0&˳`Mj@HtnAVv`,u>:s[ߝwp\w4uhVW'Y^kcIɝ}.([_CԃitkZalS_-ͅTj 3)v"awHPo䤚ڤX98NRxWA {Is-|Ýdc/3MdtCs=-\ښzƽz*2;Ng& fiD  DY)3*9kȖX9zƬ==7 ^l/9g5ׇlN`@?jfs*(N?e!>ot0MhKM_3kaz~)uU-˹HB K+|3l^:k"w*rrD7wYѼЫd.wVJFЄadVP Y(%/tiHf9lX6$(_)q7v@}iv`]&F𠙏{0%Rj7 (̱JT(x LNkiEd;D'm$t6YQ ĻT2A<>EWBAVZ]=O{׾CЩs`1rD^h z$W2qT5!␓4jJNz y3dI~Nbdb>jϟ{[7O{8s!.oӤ`vi`&sw]\H_LS9yv岸M3_lRx>V#N$^}߼v&:<<?-08`oNwA7=Q壍_SX+N` I*5sp3f9u6S{dulCc4ƜAޞU߮9` ~Nmd^si\&Ŷ$NUöWcj+Ęv,XVqͪiorYVFuj򬂔eM qZ,6qT8R+W%s@iUr<2J(Tx3+7]BotqϯxO?_}w?ëcyq`ك bD-Z'eAׄ!|tOR eYN*1&7eW Og;T*̋1ݹc~۽[64;4RxU UPߢjn.U.z;ԫY/+Jil:m~v7u6F:Mlڽf6JJP'n(Eˆ` JZEP)hL2YxIZYYMACs{K݊0 _W٢ț}>vZ,c&/n؉ !4(`E# )0MdUr',`Iin;MLGw1q@vt5nδи, ];39_#xt픝˴:sk> FVxP'MDa rʎ 0]фd5 Vh^D<ϒYYMe|¬&s-ZIfnٖ'cfS]f,HWmҶȍTDl)\tusvu֜>2DNLaRY:T{g'e?zWMS?M`8Z|WU3'W^:pê4h}Z_7Tk%}wU{앖U{G҇;{ڂ7cپwqy=u#u[ta2|ۚ./ҫVB-8k~]pGqxAo#, d̃3>=^jqz{J=g3턓xCUͻtߖr^6ۿy2ֺ.Kf}8q0oyگZٯ\kڛMfv*o7S7w*T]TڣU.|.dzD`C]1R7E')Řk2D_VTrT0 -;,a3 /(N"%M 0ؚsɵ坊*I>&KkoB^~}]5`V}[5'I;jp |䳪%0X@Vre\z3pp<;s[G"usAԧ̹ʸdl}JNI1B}. %f,˫C~.)웙AqWK0qZīL]$VdEݓW?Ow)}VZj^+}2e*0c^-©/L L$l3e8Ҏ(1eڧQ?@mk=ƬfBu[soŽ,>tV&֢GAF ˗ufՂX[-EaN&[Ί v)]1cdsp=#EWmC/ݘd]6vn} /DlTnG:\o ,fNE i?OU{,\g'iPMྲ>S_U s݋W?{7oóbiF]jITg796n{paqR@A]zW]9$ gHpaWo7Ӧanzog8+jZk^ej2*R;-RݏJ%(QR2jaFM@.I ONjf +Fγd+0mE. 
ݒnSѝdS9#olVZ\(MYlx9{ E'iV\TN+]"at\8 =7ur#]u#s4TA+Wϧ3w3] -VڼsOcs"SԅX_+ ѥ᠎I5HP'R"Q^i$f'yu*9L!_T['Iھe[Iڪ@%FDk| esN'" x9/Fw!v{bg19.sL>||H }VѸ~ވˏ>l C; .{kVNu퐳 1hQ.d9ȬMBpqAgE9-lɋf9KF= p8R |=<՚bSFvZdYr.ܲlzq[{I\m'LHX.TKS R5h)"x[4!фD̹ļIڂVS*0AJ"L{O, >rKh6ݚ]ءYsW@r. iy.lnvj1û.=X=99 ϠƞW`f!J *\_EcGa^N[Ī`I,(d6?d|8l()#%CdaStAt,Arc2 *TLhCA*)DJ a$ΓxKeLR9^ *.Esviv֜52>$bJtBLm7i+ԣ9jۥ`sAeQ@O'٨iRN?PC7<$p6rr$d•%3ɭO>8$GX)⎋@9RfeVR<rD9Ҝ,nک9׌85B{4aY0LҠsbII|Yޱu֜m-9.?V?HȞxG"c V!2 |PkQ'u1Ԍ"p7xD<}G 0L<N$>SdtDֹòXtjujI$>w9A$ Hf޵q$BS'c" ${<$З*kdHʊb!)ђHtS i63U_US]K"xgYVkim+-RdŝGpTPM'I`gtR f3HV(r Xz+Qzit#IEDHo}1j*Ʒ,mu``t%l0Y:Fm@2aLfl ܙ W4K2֤b_lku1z6- k RhHiCCϣHeBBlH&X!w_qF-xVzAR\ FiUJgK.O)>~#K :0u6AH",i>#1F)k,nLYQGyKBQeU^Ǡ73yI7X2,>p_VvyuSF~"T|p2D5i.gK0T"k3 .9g3IBAIt0<=`ڈlrv6CPCL,U謲L8I@'2ghcv*a̱C>&W -};%&Nht\};hi5f|KBza`yt~xƿ_Ġ~¸=M_M𗹥VOR^ݲ<?C{Ő=Z;o:*p4<3Gi3y[曽{'?GSSG19Ф*/t.'q49v<fy8Bo4k7}%ǓlFͿ =M C-_])}y4<%ܵ?i,_CN`?N;:;_ީ#8.l*xuA'GzI߆٤њoftZw֜}Eo}aʽ_2 ^FByMVZR,㦷@O*QJ׽gl93#4fZu0 ()MiiD|N X6jB{^w_o~:Z[:RWg[N[&D_[nռ}2,gxUlzrw?_(Ozb8?ᰯz-ÿ~]Nǃ>H΄&Q~>@iʡw8L.6tG\^N<dW_411Uy/*N.gXtCBtAZ8V9:[]I} 2iJ lt w|'2H*، 7erum\ˣuy_'˵YV0=y5_?"Qp6쁽>b7C5 =͋k4g˴LhX$A3|y4l8c.`c"bE"2TJ,Ӭ@Qw7 ,W{Sv2g M)s?hLX|a;GN@JQ#Ŏnܙ9ؼܧWv, 'KewIAZ 7L+7@njKhD] J(g9K,UzpW!3>+!wh6f\f8, .ig!s udښe+٫tc+Myc{Q*oƓڏW[HLڧg:}0ӄ<4 =Of\aӶ7xM1FEk(5L9VLLiܴ-ؾm|_n)KM̒ N*O%9"79$-(xq(2ppX6Q^ $1/82b\) qRy\bPu4We[M{nY{t2۳<}id"(e .XBDNgp=D:cFS*f= 4:I5/2xHZ:8!F2jW%=HOmPy&{F`kgj=RCo2k* !p N8T4I~H4'E)$<38g13Ӂe$Eփ}hCTB&sHFj܍J5,bg\@ 3^期l&hz1M _& A:{戭-+|s ڨ ^Ei)<_J^h^ںNt],QHQ`SFLT vyʃ KUFj܍q4+.澠v56jw v+ Fbd\GOI(A%B2ޓgf HdYQJ2*oBmlHiC&fyI% a1,S 6nYMuYkǶh+#CM/\ SF2\\̌5z0" e(pux3Vo ZIX.U SPr"ɓ`J"&Use_e>:iɶ*pōfE 9O7\ޚ"p̍2 ՞xu0@*ޤ\9F@A#Xva,䪖pX~g[QR8LIBGdQ{)+Nbi& AтR2R_W57mZG VSBfr=.w]3]`sJ oKlvQhf}dȣJ9&Ps7d_sڑÒ]t)D.YD綔峈΁<:i5QT)@{<Fg52Oѹ*>3eLZ+?d:4`!+* H3u0Hd Uݪj:o.ǗxC:^Xwt=ӥSkA' )1A}O%M)@ 7 EW,~V/<9SZVH,:yn:9 >\R$}Phzlh$6~_Q?aekW:&Lt'-φv iytM˸d]~ V 3Ut;tw}$ѻj7˅?8 ! SSMZIikt!m9K-0ǮGƮm~e;~~¯Љȭ|܅.N.Wbr'uzQ*E^#3zN<b&d>$]Ry$sϨ'`y.f@eL\ L } {KrREFI]rgWڳB!]ŝȧ,OTHY:TUMJ Q^ /f /ծ ݼI.tX5k` ᨇN19zj6Oơrm mu'kPg㱋p30Ԟ*^ R\6.y/)n! VaY؜[/@:5 ?wS5{70ջ"Q:p6oy‰ctC˄q@eJ7J2nR4Bb8*낡,Sr%{luBey4r饟*͝ \xw}pup&^?qTqIKڰre΅\+ddDG.5ν7hnޅTk׶غq8[:]=l:l"-un ﵞ3Qۥ/&qZpM# o"|/\5yŗ~ty?=:ORŭYVڞfayeV߼8Q(;}S5_|d,V>\ m-%,C/6)*%ڿh}6O_` Qx9suح"M­='ĽR֒U/I)d`)`12$L245*sԀ`^d_8bxLܳn]|Ƨ:':U%5xB #=~b4p§1EJQn-$ u%Gg?պ[O_{ {0^AZR3 D&2iN9*JIYzg+2Xa!ۄ)*06_*ـw'9[eJ*J"('R3OR@w+Hإ7.:N`0bi ܮ~ wjKwᇃS7zzqrrx=7 빥TO ?P\lk.~!lx_2&3s:ʑ6QgqQ68t:q78:Й{L*S88z'~8> v4Dy0@5? L&{y(h<ð5LOP-C(?4;. JzOQr <l[A_3T˛yHT~ja뷟O.٬/O"Pk-vFr'A(?~39~gW+N}Ϳaiy(Ʞ4FiBU@9OQBן- aP#]8%C[#Ҷ W/EۋT[Y^tss7FU7?z_ڪb{/>?„qhd:F*x8mwՠWߛl,ڥC_F?vnxW/?Y28N?f |_F[~iQ?ĈH0ow6(o>dk M>8йO?5i) ٷ=ݷhWߡ4oi~=\PH[kEX*nV[tdoޮM0%Imgׅn0ZvVFY$o{-lj<ϊeN qˮ_30$',$|f+>x;m`4-fEZdghS r8ldL{&s5Yd}}0ODŝuBυ,Dnh۵,c,1Õn?j*T*hn2i ޡ5a@e>3 '\9$qBN8cDfdף by[AVmCG$@&S <f\rcEĈ^kbBaC-& 5~=}bz{K@T_꒨| nVNSp~ҁW& l,od42! 
3ϑ8KT=$WiUĄ rcEY&%4DA(%XLۑW)4c[,TPvX4,3˜r|iz j?{3Gl 9%NLN:ʹ^jt ZF#9ӈ~)zZ:$p"#{YMtP+0Pз#&&,RD68#sWPv j;V!=22#܆ ʕ#>GoɊpQxUJ=ddT8hĚ$(̠H!FctA;Ws;6N}eqg~l0";D\x'>dD$rL񑠳M})0HI\3eXC QZo RpjҔG-X8F@ЉDO ͝"0DaD,&?'m.uӒmqEbk32G@gV!(͘X4h ZD=dB{3p!pqWvl0 `/5*.~|G͌)ƦN0Jk) bUmJ:,(Yp@]Cn"w%oKYVǢzfnS+B_ txHr #a"(<1Gct2K{rJ\k!(*yesMKeLT{)u(2KJ/g@'Н9D4f I p~%|Ż,h/8N|R&IˊiO'}2dل|ysGyR\vs)%qyx+((d(&M W*rY;)f^L!)Rfe^y&\ cbNk꬈ddѢvHr\3Δ%9]̀F~U,2h&'vVL ;(jN ЉwD)i'2 JA ]D JsPoOirT^ @'I>Qdtx $\~Sq@>]GQ0w~R'-w :$fsEF("\h#21Q%gQPc8N%6)[6D"%_.o>."Q  E"%P[kHN8Y[m6:g3{ȶG ]A@C D)H7HT$j )3aQU_orY`Y,aZbL IYV߯zD- %JQ$ΰW5ppH hp]ɭ517Z蕵_WLĂ:#Z1!GNdB|\Z Kt"qy=EEq φrT^Ga3]l\M[xz@:Jҭx*HOSS77%2Rx/Jk ^ءAGsutT5 ڀ dFQHVkV1ʁ #m(,F uhUG>BDQ".ErF{SZ], Q&8Ʌ{aPФ"ZHtHE}A@RB)2FKIk,W2s?:PoЯVNX!|Z/8z8b>Z):\i9Bc3,Y58K2cI)u@;htE 梄m ΅ k%қ 2vkJ!3P%t,J1W M I܅PVK{?FΎe1$v>}z%SϷXme;/JvtPx2:Nz"1BrŅ'6u%{c]v-QbRQ+olYɶnlg]I8>mgP9hΆE,դA)8eA^ϼ)esVZmT7[TJ饛l 'w1l\~^/J#MOo_fKߗm]j^GL?&2/Ǜ;A6j}._kOkŶ5=*0A؂y QxN,$SИ6ކ,# @i^dBNs3 k[b;lEnkP~=r!ʮbs8}~u15?_4񧅤OJ^Sy2N U;OqkQ㪜M?8?F8(<̈Mj/Çz6|GQ~h3L`&&7ߖyrĵqpAA<5>鑶J֌0{ؽg].fWpq/c DVo* j?} f $//hv9BY?|%#B6:0+i|4r[NOi ,v]fwc5met5؟Z2GMi=qhҟ_mw4OzoF!i\͆PW)sYrCBq¿qKw%r+injjlɑ _*G7|w+݌n_fO93w[wpaϹuamo~7[a<moqu{wy:w F>g_gy^uɂ=&izK̒5_ܺk vO=OQv~]qC8$ОOW 8CAE=+R˘@FC] ueYKZJaR:堭1U@@}$'x.}*Z` `"ʛuQ!eyzݺ~(nm7zmB=#SG !jX];@6idGv((XrN^Bi7%y:Tw/[e2|9xa. T R-l/8f0&S ,(/3#@=@?.!=*+k?^%^2|V^{gUYBOHTJ^P`)v V֖zBk+m_WOCcAqI[&ę;DmU~>fKl 3 nCW}cՓUY|ɗ6:Aqf6 P(< CQ]V,e=Ծ& ﹳZ+K ɲ&r>+ed=?ωYymu;t;"e.;tlϿ!)#+W(L84 ,l1<j * ]##ҠxD"8:<"+4V$Zȹ_Вu4M{^|u=Ů,g,}zsq{ҟLWͳIeuLI 2V;B"*DF%L΄LC5!e(b4%dmEvRFBU)'iL6FyYy.kׄpn% Bw,VA>2֢T:3RΑL50P6 @[#gVs@XJ n_rxҮӳ:061^}&{jIV%T|!Qh@G^ (ZK(ғ&6~*:%nsNE¥B+MFj54Zc>M텷F'˭)j R1!gϓ!Ec) /X,'GF&KIQYQ a ٓ֠V:i+ 2@vBX $h@Lv}<:qa?+7u+T2"DO^\PH4 fh"*[ӱ׈6#sIpF`wI"I$'pZeJ &H Rj݄K9+)uTF8'#``B"P*+!-ޱӊ՘l5lQ\_Zx l< OEg.iŰҪ'e9c1Ic~bHMU&!IٿjX\i|.oVw3z-yÇJ>ZLHH82SKXht&<ɔ#x63gsMԖ)ɭaKrmH?}lÓp8 a>,wo1Bo^CswvO5'ћ7^GLx15(tMKo+NI5j8ޭ5n5qScvۋw7bTĜ/aONg{p 6 NpivzLgqj47= b9j}yppt8g],rӬ '嬓Džԉ5qC_L* >G7c4bˠx⟳^꟟;᧿ݻ]>G!wޒSݰ.͒׬L/'n^X;zc: s//B+?k̘N!6mޘ-kPܙԠښ[L2os a^+\y3rs8-np+6ҽ$bP߸ e@OZI%Ʌ\(RY @"1,8\X#={ڣs.IBF Y[vZ!{D^h$p,N;uk[<{>΋oe_۰ y'to`6)`\`h}Z`%>|<*'t i1])n C(Y27sxA]A9@k{4ɇl+m_! IKu^$;c`3$X`FP+D2$EwWS)QdK b{Gzai` 4%bѶ,5>ÇD=Z N0Ei|8 U$ܬ+}i}JUvԻyUç+T>dPա:>ϝ9Eoêu{ٗM5ꬢ3oQ^5W4qj<;E'|^RxU==mwt2Oo8{] kOw;]|ħhgot֎' "ȋ&LR5V}aXjA?e3Bp2p G>ir_w5l]"fxROotuQ5_Ug"`v)|C'_"R4 ִxD4d;ӕ;ZJ]gc NʄɵOYb(p!BDqgWmdvtdO"{⮰&U'\SBf:?lhǷ~hV}_ ϯF dr\G1Jn=J#η[޳r6؛nLbk^r}|#EJ2z\PzH+CwFph!PISO.(DN ,okC~S5HhL ܕʥ=W7#EZuX ?箮a#є6>PCx c~E h)e]jl!ܙ5~e l ښGFJhyBcJZ<4v5\L2WGzu}tpפI ;%L}oD 6d}1I^,f/ 383a0xF6*$8joӰt3kś5([nU9kR'{?5ZUSj5y%y5'Pjdxp쪬N~'+2gq'|s_"ΎW9ΪںjFu)&kg.;҅x4K0I&JM^XQQƊ3ݦ?^k}d⽼]-~\a8+yMLE!nAs RpQ"AdkkFc䒥omr;ʼnQ5'0DrlyKf_Mm[̐z%(#%Y42$He ¼Ak / @i 4z<{mR& m7ޒLu&S-KZQvF zK,Ʋz&ā\{&-&aCm'e;_\5"m}1 2 kdp'(c{=ԀJJ*J",N9*gR:a,ZHqq햳'\PY_89==W`/6{1-JBKIvq^ok d|"9\[LI ( 5JSDb9%vm$AHK'z~?_}S8l:~sz4ٿphe}>%)n}b$шh`8uБ*礲[@s+8qR$`BeW( 8GUS GgG;'n(]Lƾ#E*vxfX]wFUQnSrxS~s"F5vy-ϱb"k ;Ǎ4ᣱ6OͿ<QJOux7Ƒnx(B}>$(vS>:$n/Ylö H{C7wb3 'c:視9"gda"hQ.$(H!eND RN xRA#RTYCQ XqML $p sLqmhk:sZ xҦ__qm&7Qy]Q,g,14<2:@=jʡhkӹN+ֳRf݆c셏Y/Փ90V/(0 `M3IY1>^5tzƹUL3ǖa^N. 
`IԔb2k SLRLf=T <ƌ!t,A@!'Ƙ*sX r9vuG*r90yesX4K\,жK܏( =SS_:iZ{\vԋ4<]tRi$N繳QSPNN'k(&M W*¹qg>ދɂ"8>EʬLJS*g"ZΕ0F*r+"!20U>s$9gM8@О)MX2=W!$ &me>km:[PѡR$Z!1i'^2 JA .j., _ Or[ 0 T"D‰\IH  &Y3OGx,ZZatQb'=a}3:xa$1sMTrTrXAYVAgh&^I`o!."Q _,2DE4$gK~ő/ji|ӷІZ6q$j댩 Fmjt6LS|D>UGA1͎hJ@$sr#`FdcB:N3)RMv=ZQgy%!^NVl>qfNb}}J$2=S%{cAGτ<:)kǃ/$)8QZQ4~+cG=/8Z6)22JQ#Vb&I3 Q#1GN ZGTUQhH$H,QkeZHΔt% S) Βԅ+6Ao u@vw5Pn6znyQIo|jL3dȧ+F*1O( #=~b>ܢ/!\`y[a)z{%t5~y=" boX9kmoɾ1#K7{H`8lLICpKk)MI`&K><4Ϣx/o˸V@`,ܰ5꽓IXy sϣc4͕A2"C))%%tHPYgqw#Iٸ;Mʙ[~>kFR3Og/V"3mbd2(1eN: @Ł ƨQC#2-!;E"M"I!lp^ xUh G| ?^S#GZG_$z  򖎇z߭8LqfW5be E"6EI}hHeΦDJHLrΪ"x<<ѳQ$Ky0R8uDSLS" JQn-$ *v$O.\dO8Fn g/u3^AT n(\+MdҜrF%UF7[c0f -6 }z(j۲'$[KmAs+.FddDuБ*ʵ@s+Pld`l߄%ks1˙aʓi]oGW>2R ~J(ᐖur(1E5-ʞ$NOzD0^hiļ7C3ᤡxC@w=~ ЅIwn§y`qW=ן`499<}R Քau` G_dHͭǭ 8hmSh\Y>t~ػ85ɸ?<`c pP^D^?91,Ѓ0{ph?; 'ȍ߆ȉ %c? V_kkbt@<`8}9J^AUZn3ϣ!Qi~!Dݰ?h~I}H? WwQf}_xQ|f۠0)dRcK85)K "N^lQ7m?Ns%E6\FՊz4+MQaQ,pu|R|x9Mb% 93. ЪñqgK'^*"e_p{^ᨑWݫ7 ^^oUl>0jGv7 3]{םߺuahzCꭌxW& z>^d"C񏘜2οp[巯i.)|{?ypykA_gqEs\ :eW_49L{:d벓泿/ ;39*OE)ɲk>/TwR#͵"=hS <Zw*q-DwǁRڧ@AXWfݼMdt?[R۔3j25sZwcr5wD Y^?7wh6z=ik& w,I2@V74L>S)"灹\E Q_|aŃu'& OBطeݗY=3? s~٤ur,E 1K'ҊDZ`.F :iE ,՞o/(K /dz `VQN+a1T((Ǖ "[.En0KIWyV/3%|錸p}3z(ѷv砟4i'App4~~ j =NêC3 (}dVҠJJo U;}&hD/g%aI2 )7jGe`( ]tA3B +W B*"8,zg՘2b=6u45[Jje8ZG{=2=).L]p"*:\c!yJ{@'1BHKn}l\GD|0a#r1);o.q6(p28}\UlzYp:md:luֿaoZ&{_= jTe?gsp9WEU^ #Tp`0 &X"5q  sR@=F#P2z C)nGK%`< 5sf"8ܞ18{|X%/̦Bvluf;pa{wkg87(Q0WY!GM.c ĕ^2XrJ+K|; T=F%8pKY#U3TK`VaP^z!aFJyabʘ4șbR #gZgNއuIM `J;@(K)µzq88+.&.#˘&IFәm.ߏڿ"moCEKDufUI(p |Ȑ5D #vt#ڈhIVV)4FƐ:) *b)OR;mTzPg.S}2K@{`kLE{-+QpjG& @oPGjAtƲ c]czoh; ^TAE8,th) fi &i96 9'(Y߅OKeg?ao%~JƩ)vJQ&% ,q8ȨTJؘNJqTQ( lQtqc:eZ ј;i6qG5O 4Q52d/'']3qࣃO1bF՗eL{ ̛rpRIsɸB+pގRy@;=.FlR"N8D/QXgQ蘂zQ`hb*w!߇LZX繜G?NyT'!b8(0+ǃ5a(8d=Ѡ*z-Ц+^u-j.P7`Q}۱b-w}"ArW2i~ohX#RaP &%@^]jdIrjd슩3U]+9Kl ?=-fY,̡W|۔$=hԥޮhs(^X۝Y'aZ~WP  "/Ԗhb6L0&;6syzʲ5jH˻A=s@TSa^iGjd~sVk/;^sXe^ڂկőc*$_R4ˑf_x(8 b y+uAQ\p˶pt1lZOYVA8C8`0ɜp+ԯ=;kXb!'BZTK;[AY*@EbRqXtğ1T Œ9K˭.#)u-2u}Sb={瑞2X*8ʝ7\ \ŀ1@1gz  NVFa,:dUᬚV,P64 tur[Pg㓉KbXВ!K5.vR |񚜏hM`ƎpT,ecs19 Щ0|7a>ŇPl>S3MLEb QM@L1˺@(˪4݈j @8 lzks6_mch?nTZ]Ż뇄݅_Tb{x[ͨU%[⒖.kîqrhLΈ\M5N)0axjֱZк~Ժ^= 簺eͺ&!B[7v#]y5Z}FkkzMtf9{:^\|&S+T񛦛/_ L};$U}nv4WYgZEGş^ }ZΤ3ôA:V(Fðy<FxOě?OwIS k^ 42Eu 3IJWc p5VTﳼ_s[\yS?s&Xj1g: Gū?em61X7j]\-դu<}4\۝{㋍t=JO _IOF5GxsNƃ(5Huj5}L Jmd8e/Jf-p4lrfՅR1S6XzGkkag)|b|Td_o] |Sؒ h`3 k wÙD>!R cmz:iȾ![%ΤnmCMO# yZ$.H @k͝MZw]:Uv|QZt֥LǫAF!FMmfUoHNJL1IZkTd4\XKT1@BSΆk޳8HG wSZ޶nFεZsԺ"7˯_tmܹ8xGC{g{7ʭݫ #*F+sDAu ˔Ƀs48G9gh3ERJ9K"J2xi$$Ȏ9+e9s2&*cpf7|iv̞>\|xoDkE' )G~E<$@ͱy?r2"`Mw=fkaG!A+!F*y|HA< \/V^t1eN@|ciͣbD=kCdנ}h^輦$ypc0Ek(9HS6d$;3j{ uE\e8^Y/z8}|6+γ/+}(/>_Lx/,M-QA'eu.!"c8hK h.Zh?y/2ۜeʢb(Ӂxv Q`JY?ƺuC"I"48 9=WsӋ[4M\GOj:$$#d"lD̚ k⽔*Tޭ9)!x;O/=;g@G=:q0HêČ ߅c3:q:;lzYi/Jh!Ief4M'|ƛ笖tR4Y^~7{5>{OPOpOF?{^9;.˿/7_%Q@dFftgSNy7:H(d&Nm:/LBK"zOGߟ*/kΎ?2?|޿{ֳFnJzE}a+QPx~r'(mijѷb}h j3Jo҄UiYHu돦u{ oU7+=<l-˓}5\ՇO'+gzyo/6V&޿L3v-]w٠/^?a}w_qo]&?͏|5aRq?pS/C~gZ>HΌ /Ǔ7\n Ukܒnz{~ois s(9-jK?$m՟Y_z.鿨:ٜaҭ71 '-y3.W;\?-"A )q+RG5^G sT5fN܋;aXntэ]l6~2Ձw,Ҋzv=eA-7'cwuR̗j$% ;m([a|Eb웗nY]\4 ǮBSQIAʁ‚6NͲxKbֈ s+MWX_<Ckr1K6 &B&pIh <j.s3k妡Okn)Gvbϓ!7 ܇D d F4^p5E"`DQ `*bBkbq#U:^odfܲZ>>βӗ'J6r RIBq`p :T%Lh۳MtS'kpg-[@LAů2D"EQZ0b`j=8x>LqT9(fzЃt JVOoY'}lsw|?SU{G:%*Gav6X$ ;eEE.ffvsRT2*z۸Ң+h.C1fa):8l*at( R87#c;_6ӌ}6B3`"mJH3ޑ%^W43OW_]L'#qP9r*'s8#5i!]I{} 2W"s\Qh^ABU4_,A(KVG&* vHI{[H#v3qnF8=[!桠v38Զj(FȄ(* h>P9egEkJ۶FXRD#4G QtRk2A$HQ2Fs1Bkf܌}P.Rx0J?ED݀"^c0rA eIZ˜5INWbhi(Emxih%|N,gb 5QHv*8I7Ff܂#ȸ8_R싋q~+cȑc€ޒཱི)Y'ec$'3sHE]t1CiǾxn)[k׌.(Z[ ţD0-ۓZpx];:AcRY.2%C%+!ݱ\|wXZJqTmtϴOvU]ERڀ9+ %Ǽ*hBi ڡ`S謽o<lL9')pZEI#0*ef;^bv^!vsz LmW>J2N/.|4o׻ovE]w=zWճ苧"tR(8AZUR*^Y7J,XWB65E.$!٨RTlx؈EL5ḧ$cXp7S'/X(s[ep 
ċ("+#Z 4i$Sr¡j} >z9*x'+a /thad@;UFP|`dODH*U5q^GM/Sc et)޽;&<ܙ3פg=%ےCz6_'ztG y(;SC)E}hCv!d6^dΑ7"ښSQʐ)pdpQOu3׀b>Ĵ!敤9WBÞC ede%DSZjie)!ׄR\{ @CJ\$$IR|lNcVż ~>D2q1me;OHM7,RdI*ks,5ndp7Qml--w}$j ReϦV٠_S 0 ʑR:Js#,Eȸ/ [E:˩;9- YEa(WHf:-}>.{ͮwN>q ozB}8|.$vX>Y$]Rq sϨ'`yNf@9A((!j 嘥V'JgMG m9;m)Qܹx:z1DN^ed el/gXv} [<]ffeWHa$'h:*M+XBiA<vQ/xjsNgsìe@9OF.qRzx)Kd)mz6|wj~lhwᨘf ob^(pNb>roʸ.>Tg( 3N8uiQrجb2 cCIFu3u+F|8uPYi)Z"DggT|Q4rɥj'36Ug717Jsww5՗"ؾ>f38%7%-n.Ӳ[6Vˡ˴f{ѹԏŭߐ:+֮{/.gwo/?=7=:Uu@lݽm{w v N[h0R^}uc]<.Ȼ^.볯Pܽgy垍<+W|15׷|{=8<Σdq0,m+']WE^m( 4YF+t)熖[䬔0c ~Vg$3g+H=Kқd1[笔R \IiaRl2 RnJRYtJJq$"@OBN\9?%*XJ> :ޔU1V2E+j4H̝i,LAhtn6T82X)(P@q% $Qq_+p#8~nQwo]]&F𠙏{ᙣ$XJc^@C}h( :'=JT(xԪ3۔{?$:i5 tY8*px$ـNh\ R1 Rt P)9c"/%8+<|{l Ci&9~Ics)E-?- :TI*mv{qfb`~6eK Ly-AJR||SE=g?njT673Z\h1:pE+ GR ]G Gen^^xq89~TRKqT ;݅UG(̢k#4E]q*FK(UK}]n5|[~]y[]x5?x 4F3l̙S{r:cb y:A_bp{)Tbkr}Mga1|c5 bZ|{}z{֝l̸VFZwrS*ЋZـ%NqJE-3 u en)TuVX?\;~o^ǯ޼;޾Bs#07hB4aU=yzYY*st/3+ݬ c`p*Ma^Ħaò\ޡתwZꪹTm|wW&+RYv@!os#]F_œĽsI0"sBA{Y3t0.&EZ D @b4x68ԋKYIϟht<=BhP,0 sWH.g $ 2 !N3:]9 zwyv_vza2j]wuGqE,,u 7#SZy)3 1$mNC1Ҭy$Ee[ g0BM;$xEIR *"Qhm;@NPyi˵)I"i%bцA_ci`GqXIj#4@Tp)J#`6b7T+*mF}Ӈ͛>?s%[:(bj;2Ӈ/L4x\o'?]1FeAg/7G+ͯpPr~X)ໃ+e|?8p[rpP՟A(فq8 如/ R÷麒^v|,^Kvc'[A٫6XPB^g}zTE ]*kI3 QrpީJ@+5`k]-D͸S6!D…Ol73o~P;EVquRof u^) \l펩5G[fIx&sLP1vsr~O GKw&$q3P<.RRY坊ضIUkJ f&O Glo~9 |ӿ Ҫ餚5y+L䳪%ת8j> йa,(~t0#Ƭz|ޗKeȹ`8WR:=xᐲ!PISO.(DN l9_d&ߺpkp#4yk+[nFP&^%"")P~^ݶ1'" y3oYLLQ'䃖AtIF=lj_Z7.cD$1J/pQ!𿍆+Qh*pTEQi񤕱=̼oakbGlH5H[cZ3i!)\`TB !('a/5AfF}RpK(1IL8F|¯> tF1*vʕ1're*m!#A*3JFIl!Uɉ`UɌpyVBj+l  u_t5"olVZ\(M) r Hre2.%eKuI뤭MlBFcBbkeZikxwl zrcvvsK1wC9{0$(|cF0T+ W-yc g#qk|>GE iITKc xV{qھe[G*PkH 唤! *ODS CMu yTJpB&S#ģ\0(9DldaXsvyhpe>%jyވݮ:[mmHˡC,잕X N?z;, A!EIGF8";: Xi䅥2y$RTYå28,RL $p *>yTsn8qNq6,=YjoSo-|$fI4᭖riz6v2 `f! ū UJvët8~ձfVaԹI8x:0O_.#)(+)ǎS--Q!RiJ9N޵$B8 ~?r0Iv`fo1`M ֲRbU$ۢش-;`bIlv7UW]]%cyXrˇU,l2x_g/_,% y`g'ɖ =ŀ wN42RJ'=pNI5ۨ2\DE&Xy> r\ܬ׷{r勹 $(-kCˣY:((sVF#v i"5-̰(_jRjl1b|Sx|cP *]c]ՖbBlRgO;7K㏍Qbf7]h)p[ qɞsu̕B.whM.p#HTy+.p[SdVX "ґ\a5LYL4'8SL*Apa"LkϦ9-Ju%:DoUI iéAbj 4:Cȧqd#ڈhI ­R8ei4*0pK ($BBX1'V^)DڨD ZwWAp۩,#AkLE{-+Qpj`L'qL!Hm1NYBY<"aܮdvmOfֺ0M#]D@ltAQI6~JY ~J$'$GZZVRXi1T!gc01 vJ$(ǔYpQHc1) C܀dڂ:0C?X ͧW<9WwΰX/\| I%uP<ۺG}D2%0oq!y D\rsʭD$\( n+u1Rf#t*&ziR:+bDԋ#D v3UvW:Ku]Q?ƈiXnB3D^$a(&?d= *z-ƷXHǞh CRA(Ŝ(afuz%Yڳmˇyx^K2nBI4t xԪ۱5"g<6-w} 1}i.ϳ ,8 Su6( εB2TzO9ފSfO&jU.MvWå3v*]% m[qgZzcNemȱZʄ2a^!XWT1V"FMې3#-'SmD?.\mGa9{ YaC5 xP3Ery2naDR1]73s%j5_yuy/A8?ˋmi !-Pb5ʂW W2:? w3tny$XLjAuIW; VQR* jg>N0RFXp4kJYLk=83%V<1o 2`A=͉bwPɿ?p #9,s +/"u[lK}uqY(*iЀtq(2s,M##FGF ^{dR`B<U8e[ S띱Vc&)f&Z iIB[Άv{ j<$~@NQ'+k82tS:,EWLVL?!eݠg{7f痢 ,<`I'&7xz'8/ Nӵtt<6>`IP  3dbnƹXYXipt!U/V?GC?flXf>6 ~i* s>a|4zP*p‰@ne3QI@Kb%$ս(;HVBI5}.y{ γ{ |Q4rɅf'_aZ[e-Z}g{㮎]}3G^^Xblƣ|ۉU zn}p} rӐ𼘄4\sQ$SֆU/)ήmɵ]+R2"r>2֬:z@=ŽPkWP:yUz}0Jezv4 H3iݼu{E[jPr7W_]oX<&2f1zu5{'_ȳkE~[OvgtםmY1DM6Яa z0ЁyClL [[pΊwaAjzraQP-6AzlIV98! 
AQmMz 2%=d+ E.% : VBbM@Q;pzee:=NUlxO aY4OGễ>0pAR쥊Ʈ%o_]Hsy O?kKCyҧÙ21I В%8L[81$ٳ8XNS ȸE=٧꿝Jis՞+|e(b_*ݧ=c{XHFD0;BnE8b^|v dE-64Q %)VwWP0 =wT>GI`8ίCL6_K i8FHf#{PWD/^OF?Xcp²JAumdjhy k ,!a|R0y]xY*k4S^]y[^x9~x VtF̙~[cd Q6N`>08`5$HcKn鬩܌, {PDrJ[=^thw֛{8[%h}C6 29oHPHju8viW̒%xlXYcP)xMd⏪ޠqï޼zMˣ?}x{#Lѫ:z%:0us$](F'(=~~{9{EwR|1]0g1fU4 !7/~ۼiFjګmҴngA J\.ϝr Q!nZQ_k9E4]Xa0 H봳]ےǟ]ƻ;O#/ªua\w`n\wf:~-90.NI̙G9E.{{!wiVt[ܰ/?3N3FJ5(-#gqÍ*0H0Qc CJb #52(B;N12k;-l 42Poi3}Ny@)M*vM,7}~flFϲ$5IoSXz,wדtc&zVYvbQgY1'Agsg~FNa4STKu>v?fo[pYMwG~ϝ/-xՁ,lbدʞ|lQ;7g0`2|Ӗ.ҋ6 ˿K:%+_  (K`eLJeԐRV_ˎ{_Bl§dz7ZdD4㻿}Ȋ1Ъd:)&{ zD^5EH\nJ|,Su =:I/{10plA& "YQFl_%ٲ-YeO0ͮfYŧSkŊni쉖Ɣ=ARjmQJE&A@擥+& ,5jDhBX&_S^A ,+VVwʕHggrePA'Tm R-уxV:۴U~_@~isQP" e w>r{sQr;מYie.qFi'cR @9o`ERW]]Zs!(=`bY`>+nrdS: y9ae2ʫ+ժ}ڛi(ȍt7Sh7o!R{Wg0V0[$~A h} =Kt6Mf?jdsAfm\6+lPK8mdVwl ھc]#D`h{,r`@%g9DrJcQ#VEB 1JU:+̒LK%LJ 8%Fޖ+:|eX:;aNf<qOQ}Fnb-HkT15%p^f1{q{5C#:&$i1TȴJI@ٰdP9Jŝ9={@z'pTa T92VukZ8țB .=[oc_S7gD#{)nNDhl|vV VFqMB SA΅\΅WRsj=J 9i˓"BS$aC`Őg<.τsw.:M(K  |Hr  3Kfdd @Ug;]9/?#<5rӷ;djCߘ"k0cyp*U4*)}**06`-+M&z`(qLfm侑0?d2Lwd2%b|1 bhTq)H IhЩ%a6XT*1&%0PJ]y!(76hB-y=Y:ecJU1EK?/X!N(l2MC;TTO@i%O3P,Gpqcp23BF4&Nsins\xpipg!ѸG*a^gc9J^JF@*1eW3`R9"wa,Y /Z(v3&dee=Vliotz_GMT,3%FrƐF A !鵫d\_jT~*pRP`:5ʨbA(LrD ,ZaXjUZq6OzDgPm"X.w%&H YXd2Cyg8PO !*e̪djXtRX‡{NKVӍ~.uVoFV2=E|-vPFT&GQ Bf01[pɝ iqƚ8vT!wQY;8 z2-kAa_/4mfQ%B.bF_&Z1"C.QHmHNS~Z Z3T퉫C\uJB[G YMa|m~`"IgH:z"t$~k_ڣ Vk!p Z1 Kԁ=-8V3Iu9G 2ʑHIp1VKld3'*#nU&,fxp }pUZ A@fB S"k3UvpD6q2:@.8!S#[pa/4nʿY1T|S?95ޛ,ڣbqb =)}q_Ƥg.;孢 B3Ev+(v7Zp7ۛeoѻ?U?,"CIϛٌ, ]ϕC?}|,2x(^][ +BG^f]CM@x%<)ZiBZ0;Jɇl}nPy~ΩƗnzdieoE {u) ]Bq(P_cML#̭%ss&gp&J,Xҗm3tH&%g&Ubf6*@:,N>Qęmln"D0Y0W +=wD4zWĭ*q|܆fqǙ|YBYgNU4J;rd27cxFԙnb8xb>x@mGBa(!2tK57'tWˉN7(UhAgw5CC*Ig+HKB%+F(=B={fRb\FBHM&`kMhSJla}ߔQ]] {< dfK=\(#@^)ԁ*gA:H a  C7S;a+0ƿ~~ѿ`যH_&>\Sz'Fg, P_HK6f8;'iL-cQ`hi:w_(g+<|S* Nʞ~;aLa}<|b+wtrO/}Tt\4L'٨ty 8-AGGS~7'b6qp2N,I"4פ%42nz +<=I%>JiI0[譄A- y`:C@_5%88Dg4~Z}臇_.ꇯ>~o٣?:G?hQ/捷yy+.ŗg]L^7M |\ןR+[qݛa)/zϦ+-^~REAdztAg.qv^ߤ87?ex<(KR"K0MUdEWjS}MqpY"/D^;|dG\q!dy#;g#ZFBJ-jEPtH^Qrt)$;[u%:@7| p,Ր]Fl-#vܳ78L%2Տssz4^; 8LԨ= Nap涷"+͕,+TĴ>deS;])6"^ "Y󶥥4o]XH]1߿&խX7&_btףtMEwnvF7l4m&A!>9GVk~D-(Qn߽~ןYſ*ome.0;M7K^ .EG(1$xv!aJr+-NPG Hs\E&W!7Bvѳ{?s$Brp9A ;/Ӝ>pi64t\.[jlW. [d^-pO&u/Z|*[–chQj{oK 2i7]H,n:&nڊ(8F [w%JԘ<83s!rfq{Pp(hsl;)8J˼icbLzt̸( R&N,(սJ>óIOi. 
9ԎZ!!:22D@Sv~<W}5RKcڨєF+!B*Ҭ+rRa l@_/thphwx-WZ~5nsMd_]v HQT{'8y͟ kBE((*P-"xWFDa0FQqfplAf;8 hW<̐²)dT7C&sWm^H.hp"M1Dh1Eq$Q&:+(Xz`T R-mȩ_eL}gvoTHqzK {SxtzQ{(4 ً֬O,R HAF2F'D1NyD@S!h4՞[RXW (%O$X#.C1uC$4! +gKEuD!..eʐ&{6_OnڜT'0:sH |@ڬ9iCC%] ަA1z{'<<0CwN?]~uwQP`:qsmAvGW#Nojmݹdotcg I}qz$ HdQtΨteo?MM[D'A~R1&;Gq@*A18i@x~QIB?[s!4)[緞w8rAB-c6; UqZ,:j %vN{:xSCAǬt}v}9} wl70W}^tB,NB՛a$V34:j^LNrMHH/BE=9gNQqُ鑃읨4`ڒ?F:u;{Xtkk}1#Dnֹoyjč,ƘzϾ @k`F@y@=Ru>җJxtc̙c iE{ 4ӥmK嫉ċ;3Q:TKRx$.h}|]$HJv[aų=r2osgq}ۯ_n[4oEs߬6F}kzc2ˠMO"wc2%tS6$`DIMY&Sbbe"QhF86XEZ-je(-rb&Ym3iɇM]O k_o~I'flלl>¢g6?^>5|LdR4ESfee!SSHEM WİD@ Q@l,8ʹ9u1L # UjY>.H>.~*qxbY=*K\4Qo*Z8^|)2͑:N +N骹3\&HbI$H63v#Dʵ KRwmP3m˥܏kP}֝v+p\safW%+*cz(p,NTbk5R)* iɟ9J2qm@D#.@TB'QBg:6B:\[9(.HQ4]x b>Т/?ᛷq_R;{\ŃrS"MET4ȥ~>uO h\/nPP ##F>bQGtBW9SMx̒q{*v}w 1H%DF(a)O/>"I4;TbF'b)P? a&C֘fm4jOxi\ {-@fEi ߍCj1aNW i,FB%04kJt۝Ĉ7B z7dJRFC4K4Τ >3SJrO5]QtŘb "_\zv TjLgY*ew2KY3~ o Зi?)%L& ty4/E`bXQ!hk b`:ÐZe&4 L2kj|2kP#et%z1΋ߖPJܕS].}E顩 @n4 齣z 7VdzR=iP0Qt>*eMGr^q}\T7+P,VKD^)\?n]m_x߇]ӣz9Y^|]=nR5GbVP|SA88QHF0rB2(83ٻ6$W=,0 a ;m}5,S3!%x7(Kţ,m$T}yDDK'z)t9&(rGic& 0Md8 f:f8ːcq)&ǐX70\$Laj̧XOQfl"HJABY,\CRlM(Mefr88 /+asesXrx,u05l.ApL#8TT [1w[G9HtzQ 6 -uEFLǽhkQ8[]~ޚR9"ͱ\a!(Rk@:aL[Ārʢ 8* bQb`S *=˃'0ҔZ~/:koc4N⧠⤵FgU/%T:@Q븳w Z&~ZkǬXIitbjObwBÛ]UrqXU |'U襔D){VA8v)5e^\*%∉F1X z!4i^:@XTsEU^"*"#L;RA0gDZ԰Lzy*4P-dbmWO"JcZNSb+BTkiҌ`cI͵BAܣE)k:n8#k֓?]L.BLe^E5,kLu]C#BRx̥R(!SkL*32>!) ?cs5E(}7asI\G.f/*.g cz̑NfQOg)Ͱ΀ipo g 7?'L:V#`kp^S+M91uG‰eG~)dhN9>4e>bD~gބ*0N拻i}<Nu匋;s?}uq,f㣯vy_^i49Uk ϼfaCeFpXDʲLx9/ȃc7mz/~FՂ N2HnʨE6G=p IW'W (CP_oz3ZddZzv*S%$#Ʈ# ʛY*ڐOꛇM?V}APC`#Srϡ=fհc|5~B"Sb:/|w}488(D&KsE"V.sгKBNȇ*ȷg%w ZKf>FdQ>N^%r,UֿШI߀UےgI/._?Q}5d$Dʆ7׸A)+u\ (T(nZq Qu%% Dap sJ #(;EMLxhxT&G !A~"box8[(v'Q~EEs&S%$)(3FE72 첍p"W|?_EDjP"U\*Za^Mmnx5Zz3=mԡc]±7 '*hEJͨ)F7 * 17ݻ.46<|[.q^Q^-j2t *=Q0 ZA5"L ѥz\Dʾ2V`CBݩxd5Ѿ{WsZ#ʹî摬CBoEqS·1q$]TߕOCSZ7aɱB5Ǔ2C0RE^9S]>_rKzGybnj|' h=4ꤜ @lm= hͣp @Z!O#pW ox =S:GTo~)=h|'U2cl"Cq˜GqY(\ԯ?AhA#RH6 Ć8F"&%ᣯuT抍,, U2?ҭ*bjE\u%eaShUzjupQpzDHZ]ywETj!4Aov4)v5$^z5TlTOPVl&SN[kTd d nqVd;*.a@"X8,&ϓ\a~ *Ӭ̅yWA‡]M0$X^qSNp *7焂SKx e$Hm6o mK$YsUoɬs4$K&0ՠKW4#-dINޚF- Y,zo_Mk%5cXSDx1;=ܿ4th0 m/`_앸^)X5L& (>(` 2֠}UU]QV5+x@m=Dlաnt+=mt3 C5m ffn]C+. ,g~vEZld/qr#h.>rnvy_k=Ҩc5g#2Z;[W`>{xay{KyP0D@#[fktΗ}\C:8;lv?!7˵> =F0Kw>L=xH]hÔ`5n&tRv_9G* 38m3zH)ek l˸ 6Z!zx){;.2thR1R w\9;dB5 V1ͫn뭢Imz N!~!ҽML$:4`[Rᒬ܄F֨i<:l0Smj1>{x/*8 ƭ(xRUEtϿ'Ƙ7Շ1!n Y#l>ݘ[\f=G}7= Ed-7X*n`2_oo(g;x#xӛ, z#עV,)eb[If1-:t-Hu'Cc4c\ G@G3s$:fmsDd&I"4$ "·cwy NrxXQ $Zt$#w3 9bph=n5B# (CJBuG t@I ;؎!bK[Tٵ|2Gu[m1|?!1[G=3P]u!P)ij& W'hL;!7pGfUhS ;l!Iw}F͐g|T "/U\~Q̬FLD0J.&9m$±g{ 0$(eliwN9N?;]]9> $1U;c-c/y\hՐFNcS|X%i%- J[wm]9o[tM Z7;wgZE ɞR<#[מv"λj|>lċh|B' uި :*>P'hE1㤦!/XpqK}|ΊնܓA97^^?wz~\go}qӿ/;Y&'ӿ׽v$_zWAd~ԻrN7r(\im%-&RoۓZ w'e>ZMy-'ŋnyW7!tt;M{4h =I:0_xY.Aoҋu?}Tb8{}˫g_xm v.N|^/߽yYw8}iw<;?%_ogzֲ:)뷧~8))z+w8dH}u"o $SR9{i^:qh?c~jc9i?[JԼtz%3MokÆ·n2#DS}硰6 C_|ӷg[8)iv;lN7٫7Gp2|8H3S‡tA{_z>*;L Uc,`hLgd"3ÇJǥuo vh@<*h<:wdn#-޶gRzc;]j9?:}&]p8>S6 u"NֹlzTx4~TʯXIK"7:6@奊11D˩H!A)?RR7>mDx k3'm2 vlK&V`lg3t!HĦY}nGb$XLuX01AX9y`NZPkԤ-Tlی nm/&{V"-XmH/O ekh1B\c.k٪WX׈X+KVzQy)b 1L,Kn3/; ›k͊{?q2f< )~: K|yۿ *jyJwm:;in U1LUԍ+p9CZՠ}@$*q o!@0bƵ(@㚣(jP| iX𦵅1]Z\mPR=nIU!d`rʡQlۢQ~\oaQfvjE! w{wdQ#*h0MDz%E;G[{s7{_?CU1'+.HXd!#*Y W6Nsl.pJ3pjgIᒒB$w.sv-=iÙ Wp"A@u1 Qp6R+߾w$/T0olf@^ ]ؼBB-v9׆ qjl-6! 
q!fVNN@rNOvNwd& ߹E:qgRvg(e86?ɧ[X]*E "뎔>Rav@pbV&um} X髫jDO)lq^dm 楌%D;"K%kx 6VXZ=dk&* 5ǥAJ#rrFw21g W_ua۝MdQG*!Fꅬz"8 "pBA'3b(,ߏl{\zfI7T GZWp@@EgjícV`(cS%N?DNFcٳ98 W Jp3c-m(qWt\.Ǒ;"͹ 6uϳZ3x`>nSu8rZ-wonwlGqc#}u<'\읙Oj6JB PUfyBh~gӧo ш2ڕS@`VYN3i"c`25,60G,EL d Z@}] ؆!/~1“0@m}غ;6j^QGXCPHlA:DFH""Cpl0lJ } tٜ޽6Eӻ*} el\r,t(#cLiÅ&DRAƢwplKTBK+}z'1Ʊ [ĂAFX )cƒilDŽ3Pv;c61lQx, >Q@ajq" 2B(ƁbڑջD1C4P ֔1Cɣ`|>mo=[{UmF:{c㨟>4DO.2$'O֤qv',EOm6PHW˭nC) #i8B+=j2 =f&C~Y$id#E^O%).$8'fݭmJu??:+spױǴn;x. .ՠH D"bHHb׭V#D=z(CE᠉a"ڨ?U,K6tlʛxHK.I꿃)XS԰'a_er`I^ڈ(FDC 8hHbH"MEֽ{IbTe$QB 2زCl0^Jڸ"WPD aZaTսaPUQJPRPT@$L|>cq(.xC/TSy?sMɡ\~F*g;DǚJ Pp91Z`aF1(bRF(`d[n1I5*`RP)Bdd3qIf 0m8$vr>-g(R іRBŐ1pN" u,- c B+d>䜇HE,: &Oj&CEYh熠(0#0h"lђUllHy lh{Bfȋ}~o%v1Es2"r~5vpL#%RDGR4i1" <,:,Y!(P5 9܀=b 'vgdN6sv5e$ ĴzWKnk&*&3s2$+ V^ۼA#`T!@FЎ"pƔqa4 I4E * .tgh!' -C"LME C\WcY Mb|&q7j/N@']t}jvHLfS< .D_p$}D(!e!ZJBf,!2CD0v *± I (.4@weH~ΦG P`wх>f_vaҴ:AGZJY)g,]]h)2`0"H~>7_o[Fvw͈f:*3%7lh):qC*0}#)L!Kgl8ީD>u"24jG PٜLhe%徔QP Dgyi1u?I𢳇x^ī9Uc &|^bbzO9rMgUj6QVpj|mICFJYlJ3?y^ZGX=zɳ DSd#Q]T^U]#_/ +LUViLQrL2fLNAr֢bG(gYC"7, bB~p1r1 0+>V^ ZQ=zT iȑ+9⏭Q(ȞCC_6b^;v{]bc{$ZSR`澙$ }|4 #(3&㼈TfpO8{6iYKitlh8%qtdKӑ\k%em5 }l#-"6̼7| PQaDn Kn5-VX.%;RQk)*n(&>*N8VǂhǓ#E/oųRͧfZ)bQPVY*wou?G2jZ?X(f>F>|z8; !mN`Iy6i 7k\u(<6bN: X=rF.rI 6H2嘟 Yz\9&l'pɱHu`AJb2%w$δ 4PLAi0; .r=$&0`)GM{@WydLl\CMhLڜ$ydX9\@E{] r{ؼ^Y=(}|nd-}ksݱ,B8qV WSp)Pq+SbЃM=qhC2 `|u)Vv{=864*hVmg8{¸7|53pqoW_YZ뛼^rV;Ll%ȗd~E(PgXؤ Bk( ~lXjqr)Ycpr;Mgc(_BYkه-{WU̖] 6hZԚ -`I2ػ0, i3MzV5xplh0|)'ݡY^n=b֢Pķ(!V7ʉQTb[yQUX+9t̞&b) }SВ|Ny OswYJtV"zT'Ci@H YB.y`hE5\׎:u& Q7nՖl6{k743zNm (4wjPC1Ԏmn1-e6 `ze'ذI+۸0Ŧ)MSo-o[Dǿu0=tq=V&Pܴm˕.=qjZϐ[f Sm~jaj/j@K4 q=֚5wd=&ﯾ\6;[ǛY,:Jϖvw:r>|_.>Z'>Z|ww~COvPUXcH=}xj~z l)ޣҳF5X>F.UwRE{ՐZX' k U ~>xBwϼ:ߵlY{ܰؖm@GW˫8}8n>ߎX]ĶCYw2  Rt~[ (nvF![(p%t4BJ[LI;RPUm#\q !-D. 8oO+yBGj^t"uDM?!_L:x(ޏƗK>AM֣:]3A.Mi6,nyXӷ-tnދɝIB=[JB)nMO#Qh=U 0Y+?HhE뛳~;.>'_3>&G[InraqK̊鄹h>]k;%{ןttO/WWouH yP}]:f'HQvFiJ]Df WeSk+qbkpl5RSo#l9:9,rVAգP>Ţ'g\Rr7l)R$JpJ)tyV٬W/Þ< o^O I7]sioSlƕF:]&:L({VX \e -X^~빾G+ 㰕;nNVv9Sަww-5CȍF rHAwza5q._]GJ+ؠ9&;zJ3qb]!?kL Qk߅\E fٻn$WX~#\'G♝-x+}C<"I I*DAuh4QI _]8yKDs_Ű$[eBn2 9G, Bq.h6"g$DAe0KaIƭA,8e1"Ts'POa 0=mס 6J)S@00uyDӎ3㯻.1FER.1SlfU&* $nDD˔5.'\p7IWOΦ8/~Z L ^P6E&h1 *t[;EVJNAҧwxL6&G<\A: HDž\gK^<-' y[M5x4s:g`B",(c<K@5UpATM@7<:G #ٯʛxC$8:6B<``SqZIߎȖps)B*eK\/^aāu>t|?$FOh4Y㷕"L$Ʌۻ/mLu@g*qj-sCEp&8Q0+ٹNDrVn+*Br'5qsHG#29yo\_';Y$9I˲+va5佤 g -8B;ke-z`^z1ou izPtPB@0 `Z(=JXZ7ZNFG#UΤm)BUfi C=EC}iFd?qjnG],.KnqZI<~{yw<o2I3)W($k飉gqSƤ̸+EϞoV+y7R7V+ФsDwBׄZ%\ Va|l;8g#gRy* 0sĴZss1.mfc,"-9~g-Y^&ϋ\&j[dAF'EKrT5P=cV g&k+4\<9qa8]f΋CcN.U ΩCsK2FYi. (Ԙ,4$Ӻ,Ԉ[pO:2rB1XK>Sh\KoJ4Mkֳ͆k5JeZ9g`ŒN4Բio0^QV@k vӶW:wRj5u6{ Ts{}$|{|s.ӋUלoUܯy۸_# po>|9^i119,Yq^ͅ"__gac\ow;NTDYT3s RnI_MosDlme5ν\1BD"83K&0SC"l9J)[vjfe2]!Uşv[wr< AQuƋl&'^d,2h-4oat8y\4?p#><ǖ㞀I WSk |&oh5p+0&?!:](x{4&7ioӏ;26J2r! \?OcnSt} O՗qcNv]~~0}UvK+[Cc~"ă(a?l|;IS΢1dL纭8R<#_ÊJSb<@S%ӿH9 9sROqBG'te-;A&M~+ ^VIέ6>h]Z nA J 64VksUHuzOB+>nA[2qwb]$ AvG[8SWOxl](*C`.B:FTGec<127T898a@\D07LRiVi{)Xu& 6DpXWL1#r ^%6\I(F< 0q 󀔌90*j >h=/)3\`~Bx<TZ*1x3 Ӓdlk`#t`5 bz)hs DvcC$)9h)ij;0%*? Q5p9S<`t6JR岺#G.9rtA,+xq|A\)䭒B (#CrGT ٝO|i/!9.ph^* [.Pr> |y_C.XceTnGMbɗN|`YcrLMSb eqe /[ӂcΗج<}#W*mI?!7?=ђb"]/?߽!@hGF70Oe +@)]{tM8P<%}ֳߖIwWbH) ʸ.xo:3&< }ӾTΫ{ u3!WyTMM7|&J}ÙKy nܯsբ܆8n8!ܯLs;Ԕ C%ދ]O7?~OB|,ᷛ?"SrɻWI6 !_$ +*Ӆ5jNrqt+J%pM_1į~-}?i LkcZkxCI V`c)nÀRbNI"HԀ"3$qFJPvt3F1 {J`ve(LjESTnѢe^9ʠ_صhжf@ y49-eÑ7Nw\cs:LVǯh0,{(F~_lc9O%,Y48ax&D9)xEJGa!"vB.Zo80u3P*Xp 8DHb$(MnZZc^Ax &T^łvRaGQ%F.b6QAq+DE6Q#DUx/a{տ%͢h/H. 
ǴHfsYӄ`(08a<$(EVhE:-ӑ l"mP)jwDQ<I|#q[l+)a,>WpT`& BmH$8Y$V 0 5}οCm ɻ[nD&"m#9RZ'r-GM dWsKRn:σ'3AW8nMS"L+qj7`c< QMk]a)U.<ʅee-uY%qv’*s!.ސae|pJɏ`t|ĩ0_WfGdY 5y/f2Oҗ2[[S ˪56&2'h;vS rAϏ Ҁt; `F)^Ö=t K*EmFVN#+븠k,wq֐G]{ۥC9 xq?A'hxo<>;yLΑQ${'r@D8i3^(-]SvO>1ήk` G۸JM ^D|Ȣf%.&,\r (Aeؓ/] AJDB ǟF5Zgbٵ|\]呎Ulh! ᮸ 3t,#fdlؒEh=?WF0c1%Њ r~N?jsO?ж 6k7Q-fb]"-7V %WwnF'CW:e#9pb0r*cl1rLآS ·Z?|. ȒnBP,Dwz2%7)RU@uLW]@=q?̿ntVn28P {W&Tvc )R? )7M]^OWކ=g@-ZgtUE`4vg E4#u3oJl}ʻi5:(ޘmf-^ #K[0żc`ު)R3U]X]]^AM:!RD&ٱ̶")*U)f5Ll?BYp'uf OǙ?:O FEX !dt8}|6t)֍} $6Eo%)Rz<~Eb5l9'[wE-!ۆXQ)>!$TJ2\1x6|Elĝ_?Up1GVm$̄Sv߼ \֥MČvG)[A"NmQÑ 1,mx%PKL{Ʒyt+쨻N. m֕G޴ۋT;n+cBZLhוZ?!.ºX<" %Ǩr W;<=@q5yP*zYQ4@Ù9'`Xf)O:a z>;G) q6Ɯ 5™qxۉ-;!5"P ڭ]N#N/>-K+.uic+ Ȃ9+/1v%4#j?YizmYj}xL{][rTB+^oà{ %KEWu Fjfw!Avʉ8ם_O y>g=הIB.Z]y-_:RACU!s3} 0kԼ f(JdE<'SQ4Jow4P95tmg˽䇋ޜ$^<'oJ:6c Z{oMdxh>8 ϯ>Ab//찛 ._|JV lS̽ORaImjEGf<Ӓ9 lD|O%??\z55ނzp)ǙFk97Pp/?<8W0CХxL7T-1*x,!J 忍@ho.TVT׎p/:/ ,d~7@-Iyswq2hDkE05Sc4i-AuGgjg>?uB*PDzq0aO~>%d\j8TgȲqH:fbXh*e)ω) ˖5DB̔ejbq?+2hm06JnI1%m֊^YzޙRe@: #LR, a%[ζ?1U]2m%012 ߵo4׆ٹNm0 Q[8odOv(FCHo_eGwwwVEMok-UJk'ZtQ!7O^*.ǑN 9Kqf3.~w/`on.w&u0Gy BziM4M6ҷڃsn9juXWx9*3ʘM-Fgك=PR7yTMyL 4 cky i+1BB:Oa܁?o ~O]뿁^vǻV+*[@lG~Z`EqxUҘշT8QJz3i㙏BbEC@"iNg~p19{p͏`F0?^bCkx{c>pcë|4}c-n?ߠk&yΝ.+K/@/gؤ.3' KFp `*D*!?*4B+sl4vH56#{ɘhrnCZ1g)Wy _e>nt'hOA ͌V%h?MN;~/Yj+rYҔ"V /L}~3ai41)9 >ͼdža(HzTw@>#>edq)ΖCt iͺvI :x X A@I$+2"7JQNVYj$̢P`-8'x/̶b(.4h<>?6_znUȾdokQͼ$Z@U+By᩠!̄ ,łqA >1)Q!0ZY$bL(k!;©)"#ZAS+LyOWA@'/j1  Gv4̿Sh9w: Uxh(x`C05I}>e0m&g97!g&L܄b ز)AssxR8${ah( ?|WFL*샮; )s E02?3˚J?<^>ȇ`X7wcyx֓&$ =nBObOï 10*ɘv , 6 aP"I "'ԥSch9.Rr\1&\=Lafin:z Fyyd><s6F`a^_|' bĝ, 6%O0ͽIvy޽ߏMy^x` l '/߹1/#$)S{26JE^j9.t ‘wQ2)kq&Ռkhu _EeypM8+|>O|UV |¡{ל)Qg8)0\/lLի5gV-}m:{?Uw;"`%$P/^X=W?1A}8ؔs4c֏(? 8%R4b G*͹S! [rދ-ǭRXky]^–c-seP$VKRS<.)qm6NL{p)SZdc @+Bs9`(C||u闱+L˘glL_5֭հ sAj&z8kZs$5J8INJOw~ד_tkmSS:xsNJ}vӓvuHS4uZ`+N-{lԒxT[?x ?4;?.~Eӆg:Q!ܸ:⭻b=JJDjJ#,.MƚR^5k-e) OƱd7i1&)[U&gQV0?amGAUONKnsʌ:䵨F=)p](Odt԰P%#.vۓ~mVt]-{C]xJԩړ+.pm 6.4O{అD՛]/|Ukń@HDnDUVmxk]oW±/} j8$ x{`9ԘdWIz BqQyzuia)ɵ'fB2`,P„7%Xh+iӘbvQvIAh~i[/=cQo:n\&gGjQC\`u3v~SQx!>~†g4yV F?>eadvTe)Qs7-EߠmpּooɐX$e55Pߨ@[=Z@]emV#Œ V6hH&ݛw1f7o,[.QsҎM!E9`H0% C%L[lD 5"N SnB픛P;X;؈r-/VV1i55PKaIƆZzzcа(ijDZ м'BuSAί>n U̸UםC@D7S3} 84~~0)0"k"~LWS5!ke p"s!L'1ܣ9] /M6 kJn)yQnʻm$"*]C1}mT+~o2ml DP ^ݛ3,@o#?<.S#&x]_Ǧa\vok֠r3ۗ=•%Eu3BlHh$QEDHZTZZ-=bN3țY}pf>OZm#GE^|-0, /|Mc[>Ib[ܲ$[")b$f**ON<{nR+i=9ݱœBavO䓘==oGPF&{^ξ>s8 +AL }p{G~O*U,#zT#wV^T҇ `38@Og~=$rKkL`æN-Ъ/cVAH L+e\ntLG)#>ȇx3Tw+=Z)Wr^g+⒂.YC6i#B (Y:M*cksCN\.z8EIRT1-{rPG֛۠ x,?QLggE5"ε~|xUwX}3h A8Ӓ?1Th%<]A8TK7@KQ`³j묓(ɒya-; U Lc15#cҨ9^t8:kFw_d!ZhkV"v$pD>DRuzE{u\B轢{Bk/Ўy6t#$=A}{@i.K !A.9H( K3 'CEƄHБ0$Rh`YzZBzАe"kF˷K-iւ2{QP)+)E$xkhLK2h?\GU1})MGjR"SW;9rg>{PÕw?-5yFЎ#ҰeNp)q+k$zjx HJ:l6ZSXpcʝ 47ʐfMaө@#$lCX@ϖﳃFP A@ Ts[5_A6&B&.AgsJdTs D듎`fId4J \5e| J4i*H.Dr AJd[Y-k)pl-?2&;eJ"`r%<.2\˯+5?HWΣYA  pQO+㰛UL0olJ: c9,m&%-"VKoyYmrN;hfwaybnG 3"J 0lNSI> Ȁj^>$uYg RySw)Pgv>RBj,&'K4OQ}Nڀ|~O7m47uEx8 D[WHx[./g)IQ'[IMwnX+$R꫕sݣa%}~m/PZNqIʇ8]d'G\}NSTӔNє܀O'=g!70uwW]O '}PC!VHc?$o{uް'U yB5'$ʉ4 VU;/on5y/W3^s` ǗWXaj9(~/S5q RqzjvFsnW7mq>j1@Vaexx%zW<s*{De«L8z D_+džP6W>g?ݭ/o}+42o4._@t Ω >#Dw-ǭuإչ%Bv 1j֐ ZRC̞䕰dS*zh}[ 髝eoZYhk}{$gw^HJޚEh##24 ҈ "=8k`*.-Z4H51-"ZSVlkjղ" JSc4ci|~t̫y[_ꓷ}򚱇U]<oy\̷zT曣^77A<_Yn "tۤ'dմ?U1ڱZ.cLosjhx<{lp_2Ftly798GkVS{ba>FFzp^'%v.P隣}ZF>~>8_<3Pڬ/w(^qVVhMw7k 1Pq( _Okӥ v~'&֘Wp(o NAn$%|E^^ $|ZAϣ.-KX!_o?9C㜌Jm2kqmttL)`92]VSF@m7.t.":  |ϔ1H[$3'L+6{sd2.r/:},= f+of;G/y[QMˇԖ*ܕ|/?s~Ε;_n ʈAhVhʄ&b5槹kN  AhT`.wԱ ,R$Q)\:x$Pgʑ@W@. |ػz@=Gx4{AP }Tpcr2Jڟx)W.\Ki&.UgE Ã|5aZlҽSkokh$݀,.ڧz2:emQ1{=.# 5\$bita aZ, 8\ H%0I\AHBpO/1#A{. &2"1UBO:puw haM. 
Zxv% 5.r5!5.,NK,bgKT8ʞϒ8)2 "x.gӦ-jM=0Ewi[PI%]fQ{bv.o4l:iHvkm9XZ //āR j£{@M,$6Jl$2lX7Ѭ(SY6EꂓAQaejIDI_%84rUV-'+x,rmA '8doӶlI{=Y:baކI0AGdA{!3 dତFyNѡ?r)4ݠ̗Z}xwLQmizBt;_F_O9il5KF4 &ϧ򰗷B*v\5,4v1 >ю J _ H-FbNTS CZ7.! ݟ$ .$e3i3 MI߽j rʴ]]FϛkQZHSsƹ_20Jh첗^~.XRvp:kVe.f;]TV>Ր/N&w~^9B qt1xcN8_dSŤO%f 4m1p+9ZTkK!(Tjb0fF *ڐ'>V/8+ΝsȖYQtfϊ*]q^͟_:]M٫l?/Sxpŧҝ8Ko_>?<Ŀa> kssȟ":}( W~7g`fb+_.߹5_V+yX=<%[R؋oO> ?'H?[X7F1˜teTBsL0}6_f(A{E +<p/vX:/p.+k08mb%1L*/ !iH!Aq1B3UP =m3 C6J( #NSM Qt9@*O\`g0J1_Gku($H!,L~W$L57 (_clCS0.$Sec2:^W>}0XE<%`UX.po `Z($,/[(*BYSFa#Ja"x#9/Bz8 qI00a4G$Bwi8ŕNaI*S $0N_~)1wuMĄ0%grBN9W YM~k5]oW)Cd!ϵ 8Q#jGVf$Иb%¢̐ w6" k2J0kF 9Τ XG0Txp:T+X哙j hi]BN~Iv%:mBFKƭKƲ`W26FV="jfEw=tS럳W\Zs{Sּ|z AMk^UZsJO oCKwbN u!w^"U:\=hk >7޿oȕHVi<ơՊYWˋ'װoM#OL2,#S^ᔕQГ\YuvUx8rC#2ݕ;Jta]384l>$w+$--\nkԈ# :GHњK{n3糽)iTPtI;N<_7`G*V0e!⩶NS*`M%%i#5,uJk'xbVK(ͱȊѝ6৛>,IϞEX"r bK0jqTQS*  ߠ1(@PQr]qCX3 Zc btF{,K-TG(\3o5Csϣ&SSRug 4ע%CSK,Frsq:XT0덆ᝁ sŏ@LXm+ɶԙbs*5PWJ/9MHk$ CI1Mq,)khb F8I.m\[LPL1tnjz#w7yNsZ;8Lf eiOQ=[1 bbS'6E穎ʒ~{h}PqWfj/3㖀r{|P1^}Oss~K_܎YSK͓L^݅72%U|DO ѣCuVe,t Ju0;Wk^tPIXPVј%]t2d`zxT-b3kU#ySȢ#3Vܓctt| +7C]:ȭ|̝xw2B \Oߙc?7ǯ)z{zy4mZ ޵ y?+=m)Au>88*~.(wwrZ!1S ;|.#1{s=(Rdr"Ѳrt}zt><bMI~x.cx]iX]Y50{8K=?Rgs9:_*7k`y__?}sg{v97v2wF5 0_< Mj %Ҿ[[nBe4N4gDڀ~L$\b.rS ֚Ԇ-' X6R@X f̯f=/~y oΔ)D%sŤ4$Zۖ5"_, upG*YX ߶|zÜu1r& Xw670荰'X L5D^_GURwar3=AS+,QYT:n9e(ʅYE$Q( 3^@, 2" ]2c)>W;E}qFTCJ:p2\cUpaqЕ&Eap_5cJJXF%1t8ctQ3̿&",:bYF)tsA,IHb']`f'-.Xp4_59o-FG~M4*9CrLr#*U& ?\=fs#-V!mli =^ xT{SǍ-zݓ/{53Y6+Bxo^7[f>T#RÕwwҼNNo珼vkDj]}A3fWLw9sr2́{CH:",?L7[>=[ƴ T 2oa="a%X!GwG=e\[#*5:rI2UHL&(Vrpm <t뮡C N!RzEAbzjZ/+"#^1!0mzC11:"ψ2U<:G.jepqCC DBDc$\"fURHPe}KBpL*"f ZQ6" nIUԻ oY^Բ٧CaK)U;PJ9 !vHw(,M(ǘM\X\V )=@(扰 4I%G8}r 'Ut@ zҸː`մxꬌ yM/ړ~;p7y@e&Y;k//.tgWzWǵZH+O0u;4jL9<ʨg1x}öZ+)9Mˉ>9*i; HN}n NC 9t,mԼ㔔GLNd"'2"sP4eڭQTfhzas Ybd4+Ae{Ucwwc=^?deQayˡ^#/쇇=܁BT{ 8\L9s19#d=Mi-XO_UcWi]1vUcUSBBh ^q 9 ~0))c:nwm~ y `af'9KeH9[b/jX,C/_ c{Z-'r.bp,1F9VZqa3*G,.s"5lbK%$zܙ"^ZZ)ӈc),8x- ;''ڣ50l>U",cW*=Pq%))N'>DS0iUb%4J cIx!LKR^qN,CxQ)WV=HWx<}NLZ¢DQqRH܈8Y \}Q34,1"4Pn:>dQ{< .c3y688JBsq4T%YKn%ԮM/<V71z.6hr)}wD*%+BB(dCQLX|'uK 6^(ljJ$!]b*B(nѥҳi\l9#u^1-yinp|Tq8'@*-WIL``ǘ#vZB\s)Ї$%⥓kQ}5Stqס1M,79ס-0,ZJt qB2/[z=Fȩ(2naOh)c쁋W߇DDЖx}bBR3 H3=i!ywa)] 7w-dݤ£ZHxn7a#A0DbEM>qL{},bFVQ MՌ 5ZN8濍"9ZXG #լ+X~QEhxJ|<%(aeAGLFf\[T[lyl*Di+ Ƈj:䆙=Nx%e ˂b!c@%rl SD:Fy rހۨJ{M76^F4 Pi9zaD|w_*9tW(N;HVR_!Gkm6UϳΛkmz%0+ޤaOVj+KŧS+e_[?aDUoh>(=<":C⨌sM_ Zml]pl)I.ae[%f vtGTe\ڏ/Hy0y(JH5֏hKsf\ۊJPқǯtsSTa%uE$9;W-*!re.QWNk'iKL-iiMu~µF x-leBSyhWY{Oq' 3:BQ-CRh)EP$3saη#G!33Z)P?Oɒ &X7-'la`yYe3砺+!^ S(yAJ'l$agc2BZ4a` {Ra[> >bY{`;ݿN@>G!g`8|1~ܨA6g%|Fx{6{-cR[k+D~Xژ"F Q*-(*WtFiV,~,oSmF^-$fݣ(ߔ{G&MkǢ婢sdPbF~2dbSPLS/.^4;\:ᔝC9 O*gڠF MJy<jbKsD''5$*HqIb) 1|,dR$Fg,0ܢ ci/P).wW8 E^x''ϴ3F:iw${Ep!/g4o}!H?zGMkEE\@=ۈğ@Tˊ]Mb)P /+$H"B;۽~Աl"EX 'd>Mr$i}=zxcH?t6}:ɤ!LޱFQ2̘i< JN4U+!*J8{>.ჿ \_c!b^;3˧f 'WoxF ?4ǹwT~Rȳ+Xaqs5(c^Z\eK^(/*I0 [p=Y!tJBd.?]h-⨹FmyDf3=ξ$"eX-FiL,/lĝQ R303ў~ZN0* Ra֓easJ;tl3Qy6ɘ6;#B xA k\l A92$xkPs*-wH_-n(*]{(Ȗ y΄B%7*%&8i-c ^X1"L4Gp&8!BLK19 L"0 Q&r6,Sd1+IT[DKardj̼*Q9VY4Leӥ&Tґ9.^Ԋr"3H!5a:Yl qsN &fD\6ly[yc)y|7q7x9~?tU\9Wuq]. 
*M']or?y3 Gw%F}'GnTRDkIvV%J3zN>xES]`жwޭ MM5n R[] RLnUWS;n_4ջ5a!_OmJHףFCxsunmP\\ X.o2879HA8KBT`8!֣Fu;Uڨn'0?;YA,)ՔSȺ*fNJ5_v<*xlD$VI)2MȄ$tF9D |!J k[E+7* K$,%BrcTx,/uJĎIOq*:N} ,xxÅ'F+0gNX=8EmUDXEf-yţb?-9f >dyfIp ` kg$9m0Ϸ>MY9nbo?JhwRT er[n`?\l?6+|r5>'o~)X(7\ $cӫYťO,9VR@\`o1 q,Ş4W 7ɍۨy:?mO VT!^+=PQ[BY7쵪#I6 \}@)lp(#m㡉C61UE^B ( JFkF Sƨ:4 ȇ8Jh LDGnO;h.LϜq'0҉ɥ Jr[+Y)/"C>W*9ˌ,62j{ojYOvLD$gȹ`-/&E@5ayf,y;Mc Օ>%$ו^>*[{Iz8RJWѨDtUW\zP\yzoXOiThvDURҙù_iijkij̐ ~sm й`.cUue~6B ~2x&f=8_GОޚǹ/3| hΚGB :8(;JbE+j+bimLumoe;۵;_ﺫVqǝnV'Ç#P7Q+%6ώR҅Wi;X \90ee{_ǫ)/ūiƫpm '.=ؽyHxRr3Ks,y? }>דr~ۘʭieC"-H tvprFIzОOK=OSA=뙷~>}N>^>ݹtƬ]OR>ڹ"lRt*<=j=rr|s;IaOݬXek}Zw|= ѯJ«&(BsAA!PEl#٬ޕƑ#Rbv6eˀgmhwlCit=*IY2Y]6`YRV2Ah{pqK\ larS-ЪGp&:"KAy Qb#Hh+Ee!5U 4pU쳺s+`M+?`30aCcDù$E9lǼT0Ev&q'`SP}2{ ۦ9L ]Jkݝ-dy4Sd(F4GKL#&diF.V)k=' Pk:Bs;;@9= hN 0D Q4[,dL#2rt {•@{PV/Gm^U.5~;=ύdl 1   B1p2V7/)[R-O tCOZ6(cvj>(ŠM1ءOmK>VwDнߓv7MkW|;J1htJ費Rjv򫛖[+;_)7W:nN1xךDՒDu*43lM#oo44^}: w"}\#x}Vkm-Zl_EEzW[_~qˋV&+d7w32cX-yG(bF.pF2+T7"6(wAkFH:\xbd0N,v*LU~tmZ jC'gYmѬޅ}ة $JT5]XK9O_{s5Ənzlmi(Gx-)整޸Ir^!D$?î^W[`a͚1Ak3w6q51z=6_n?M;{p3:n.׫ՅڼWy4sR`P!>yH,9a_%|tow<'R0Uw,^m)-[ܿ&c+s[,0nc*hv.|+j7'2#Ԁ§UZLs+B(ajT:WJKhc ,4+ѫYmXoek(DTg<ʅ*r̸FVy08f2Nf+CcۣՒ|^7ދjw8Kh,w8ȤKpDY37y2WP5N[&O%>zьm[p;vt\GБ >83͉M}G#GTc]_96DWc3{OQ{LJY"9M!|mI4V+xebS2q5c,Ғޮ@ϋR#2ɰ8nş"{~Zt-Bz[v8a|glb|UqcǕ8RK$ |"cHiJTθeJ P#Eܒ#G#"0v2ɼ^qR^Xy:,TH 12 ^`d:2>E\.9 cMX`}W7q u<{Dw0Y~jd7>׫1+6pV.__8ۓYW'c9̘Z+贮wuچpp>XfENy^tz# nA8S@[Eb9)hW|K>UL~|W?EYhh¬w1VFN?5E\q%~} K -GIw.q̞ռML6B߿eo#ة>}ztꯔI~*%Twlo>g'O%0 r>zJfv5K|Wت!VljXz>5SC'/zzi%[4c0ç/CN.sZThJVVhي-廑+&7k1r_?{lBxG4!pJZݧ|Kw7{s[O a '߱: M kpypwDuvk•4m+hhWxީ)-vjkE(!ҘSgpt cK'h]HF6˥iJL|ohI?r0-5tQ ƀM~\•6y! 8q&LXPZ95&669[l~h5c|lyd_|&S 7>ܑCKL 0}^7c˹_ {P=Jh+WYQŘLy a)To\zuL'xEU&ǽ6J9 Jg[ INl넴~.*R@D'4P`y:jg$iMLJ!J“1PbZ!7qL w!dMk,ȩ9ZJPfM6 j%Q$ޜP΄ގ}|r.&dTCi\,9˥iifo@P؂7AyGY (rp)/pF)j2?|>LpI|9z[MoF͚Vny ۬ݧ7^Š!/k cړH> 4J4cQaVc?_^49d\nԆlq4Kt86^\}M;xk@pN -Qw5Mk.W{kcȁ whTw[n]lgP KY)k:-tj4A:h M?4@s~4 jvƀWTw<)ڛ7W_.WU»{4}~ʒ7óervt_F=ǜ5460^ֲ[i<+lQyC}xJkY3p5ϱ~9 S4D1h*jrՁ!4i'"W˫01\>z?}o2JӳAi kgXws4{-;]᎖p͟ƀGñJ#>49@3h11mj!P-,A/Y:%)GMT+Kc:`%eXIj*QkKu@]AP^Venv&a [כ꽪e7<E*N@U,PF Y7;&]O|삔LԀR;ցHJ8.:z0zm9mjj1Jw^+F+h%ʉ\Y $a&{ tؕB[Ϣi)1z"lYNhH2h/vA-tcqBT3Dw)s.Nyt)"LppjM[c )j` Tzzf۶W IcSTP?)2C{m%t'cJq 6sX]Жj[`KRäFT`V5,FaQbt?Jw%C66 $ (CWĴNj,(tW5P {T`m.} d s |û_vG HO{#V_ߟ6-A\ꗛ9y)б3Ry0r/xv/mU~ޞ4#>ˌе3bZܡVLd(_t_| X'vLf7ST[q;С (4u_\?\(!(YT l= e.^!nSO5*:*@ e< ,W bU+u $ Aw:R޸0g1mxҊE;n@WŶ("d[I4C m FV tȌj±a TKY_9d8Anc+ެt,P֏PӏԵlCgvbQ 7l,_$/ǃLUOV#.Ӷ& H~h:0t*`4݋'\1g}w>GmpƷGl-ڦ6GۑFYp\zq?!Bu_l86/)Pm zMDzʜzm]\?|(W+] W'/>ۛsb6*lzS̽aI7a%!mhϝn$3dU*7xbvL&)LJ&#HьZg4;ٸXWwW{_p>?75d䷪4v7om(&D Iƛ4Q֥p:E`Lȃ5-YJju JR,  EAbr9oeFY+ H4* @(TY(W `Ѥ^̤d.C lu1:x!B/ `H.#T`,rMBF .S H{"8'M( Tib0˥fD'5@NEeT!`C(4Ff.pO+NgJIl?}A2๗1żb - CHi ROϩSE29H9YMYibCrI2KԴRZ{$U"ha8EIx0RB;R8X\œpY)R,zM9hV{LMZ[$m%3QgzH_!2a ۳ng^!iInc1}3X+fh[b*<"2#0)!$U !\#821ù78F}3"bJnW7٢Yd ڬgeW\["|Hʅ_DS$wVa 2p^˔]ɔ r 9ұ LcBURiC-taa<`C"-в s"fY-7::kCVtUx"9R)5S^-Oki[_3`InML?KB3<-첍T{ Ti[MT{|%v& %17^A .(ؕ9q ybkHa%`4p!*TEΪBР!yyuXZG0kॻS"9XˊѕrT4JyFJk 3f<$<hv¯'vu~ؓ(vp=MƬ#'+[-/b8?ŒaYMxW&|&~]kM3*kMZ>kM9&ݼhAa_Vt/]!a|N` L; I;(ф(VD)*iy.O7FN///4(k UXj5˚%01qJ UhA!:{ϻGgɡeqܓX*C pz@aPZ/JPmڔT>HgeժE:Lowqi)A\NJ{LE[/ARa"-xrMw+9ٴ>AՋ1T`¢ql^0O>ɫOB.kw}$qZ#0qIL;$w~ :̵qJi[$!iV d/b݁ n^+qGj`׏Q~Fblp3 8CjD>sX`H/\ [;5lrBvaDž>fu}sqwy־Qr8ctp{32Ӂ͔W7ӚoormwM^nxǬ!8o-Rn x<U=!g/ o=kXE; OyF}!'r3x3,s\*ffg8`^٠dd')`qx';|&4Z0hk¤Ƴ6a-rڄi[&5 `캻gQcϢDB|[l}c R21PE7ObL`Ѱrah8ZY *+Z9/ٌ2 W\D4$A.r84|/8[Ÿ#4|)vʰ>pʐK{QW0+L퀫t1zkG9F̲ۈ/`R$*䉵Z'c F 486p~Ҕ ̙V2ԽBh31%@ySZhi@B tfa0\ m踔(MX5H0eE$F>#6H 
WC:8e+$iihnW^IzRLQ${=)AīQqo:+v1piL?H&gM82]AC'kϨ༔7L9fTHĔfcF&xAfj)ERPI"Jk'xɟj&E˦,I_nxǬ!G->r<ÓiM9BKxkʱIV#t$ő!n.A$@xN3QX_4_#CjVl %&|ŵRݻm뾱QTsf~}Y|x3%| }w -? }ۮ5?ݗ2ޝOaQYSΥZm;HRs> 'QzWxЖ!lMݙ/??ǚ2B'\8DSYX%鶨nӇ&f@*Չtf؄Y}̒Xn&_?7ć_\-ntLD-v򞝟IW [)27[j,6#DRT͆[-x _nrbDUՃrGUS #pwB0X+AoyyG5Ӹ F :w1[|jw ,{P&Mi>tΈL2!#-^ouݡ$;5\k][OS#6<[6=$:Zp^់ r|NtqY~)56\D[˻ju~WǨ8)| . ɒ $pRwE Pοx l7hwdxۍ*r_]ߝ,p=hHCw0c^BG'OHdS.Zű; ͭ _qoTqk&<yE# JK2hUT#LjX pŹ Vr;+t j4:kΐ 4HCjXE'A#XkJa8 0 ^$HJ7XqR)ۻ“kwRG.0XRرA.L Q$5TX%x:Z"A#>=S@U"LBܪ҆ S*~<=Rc$d뛻_=Ri,K}sa;iw{=i+L8į1bG3C=@2$u.R!2níUM0RO%%C̲$,!wg) (y]0%P29LN nʂ_S͠K<ԸnW{ Nb\cЯ?)E.ux]"^^>!t!+ 'L@&{ɤ:wo_QO%o[ZݔvlEoqο_/8 S*ma]Wyݗ~y~\Ι!Ę ê8Q%u6:AUZSX{aie2 <Dg?ʉL$.ib2ÀHw,N|g(+(t-R8;_zox EF⮩"i{y8x,?%.%X OuX̆^Ȯ_ixT֯)P >4;oX)FzSPƽ/)Cyr P$+ 8ɉ _w`rÏCzƐr2arMATM&I 3"r(! =v׈ОQs^Kⱔ!1$ 0rkVzHV#᷌]#j畜hNby _ mh |(-,j@]RWȀ ALS1YܲQrM<DO\Qp̦/ermy`B 0V( i8x:&aYBFyx`Z$ Y 0`?wo{@^*bvLЅaYMxz}{doX@, `Y|#hҞiIŇJ{0:mp4-0%5>@sᶍyXuDKYNegAYb[XJd>P:Kl:%E%ai 2ϐc1 ˄rm`~tk es^[Ä!PNJFUIrsz?.'`7 唳rPo6.q{ s{^0|VQjf7` M(?GIKOZ`oL,ba!:ڌjd.c'(fIB-(K7.4 9ʔ+l~FB;(YFB %vcaYj7QIb!z<Af & QFZwqH4尻mIXM02 C3 ߯n)zI$nY*>U,YeLFuhlJ)@Ao9)  9ElKox9sJ c;De%υ^f C%(1M0O $B\5&SiV>@WS]ݠ6פ`gߺ\3{ڇ5g "' n[E?r}v䜰yc_'nm?ˋOg')Wff}N. 5X?_^^}yM=OϾR )y_!8KHQB.g{Y*PHT\Zr)q*$%.4sݶ%Pn /I^*rgG׆0BS_T$N AK CS^P.DW_3 g{^ҹxߖĮ$e5%'vL53 dt=Ա6:3ksbd-P7]I]}W>&(Os/.JX7a!nn%/gf0y! e\LiV~h4Vs-e*;2(> wnAajI![eܧOA4ЎH\.mʰHaWg|;Jcñd_g{ulb8 Dźi?KG3a(ӽX%!ҌnEpݽ٨IALVw -LfԒZ:c qha9 +V SfPp'x)6 7-2(93{h3#ڟ u |~|gy}׻0Fe,Ϩ ZaA*ͭ"H3.,(qZU]c50[a,8)e:`̛-ģD"14(ఁ!NdV}v;ީߞ_~vrySJdǥ!HvJ@Vuiz@Phz1wڮoUFB>dc.CKnx>UN_9=.5X C53F"`ya (%^^]EuGg z+yXgU?ddj`52Svp)G%*aJLj@r~eg\=e"rROF6,z0U 4YHn8]20Tȧ?gJ2T 3)^i\JAPGZx;t/@5ʜ!_X1@l;y *@p%TT (B ^'M3~ZLprwVqa7P`]u8"BLS>$S~SVr3& y**'B{ b9SQ3F+Ta*0&G'boFkqA|XzzFX9lC, }Zi@#m}|,\HgF0gΕ@%#|JK'c8 ' \$4lGOi.C]M+B^3o׻Z[,!6m=i-Jۻw n),䅛hM15}wb11gn>iB`-OwKa!/DlFS6xzۻqj#zX BL'mۈg$/p-,лnmpj4XH@0QQk.u:̸U%c(g)(2UUphC٦׃1th]^}[.ݱb~n-f[k)K7V9Y*> W 5RBQhZRNJq"-:UC"IRZ 5Qέ=4Au4L[GSQI6u0֚fa_L8Ӆ0\1:ڰDh m( 1ц/#AkۨsSESF:*g* Z8* \qKR%W/ʹUZ9BEs0;fgԢ_>Q&$teJs^`\X<+9uBXc4F`D R9XGZKq eB*+PJаp"b\Hw |'3c BWk{S `u,8]!P\Rpۧ2ZD%e}AA )1T J.`F_34)Ev~ #pBvQ|wM%!;+ 43ũ4h7#϶F Ɓ!ߝE`4 *F]cpi!0nd-g3:jr-!G R9yRk t^c֌D/6u `vZݥޥl/f/FyHs` !3&)a:$q6G01֪?.n쌀v@θIB{-9 PMr@Ľ6vq⡇{ICz?=#>Ľx~ztڍzbpm==z mzٓ:ljzxsJZ 4%KH_=:QIQmM=~X BsbѦ~ls]ԩtc),䅛hMo{7IXwH J#Ѫ:v)ZBձѩձ ,䅛hMq3WtMOûbb:hF|:⳧[ y&eS3S}(%{Fb|Ft IJ<&nmJKŦuֽ|s;" |nV|Ftp ah2k9PcF3{f==|hmq!ءvƯVȅ5Vϻ)9}p=HF߬^GU/CAbL>1$S +A$WF0j\KpI YEۦ íJ,#iX.֏? eqwLV U\?qJ>eK gWbq:ҟq_V4~飻e.Tl.Ю")F1F:bڋz>U{Y jNMz'RZzǣ0o ,3NS{=Yls?wOoxaN^a38 {t kH_:F9!'_=B5̍ (VJuoV=nd_6*nBkov_eRvt?u߆bםQ}; Z7T m:su)ۄX r_Wp94"=@ :mqF(_`ўa>˗! OCE;+Qw͟u-[ԹEjn iq-If3cs"u$ f^/li~Xaߛ{_[Un6}ЛٺrU ?wvU~}fl!|'s|n+;e~<׳տ[_8l-g'_E~~O:^0ͻCiHRq[UĽӸ輤h@@*sh*.e&UN$l!J+ GV촊BKK/Ṗtsas8Y[(D$[t)(PӮ(rC8(19ZмȄeR$|/S+` k6/rIxߖnQpZECmC _hmJy\3(2*) JCΉ͙R ݥ)","cEI5D Zb^{AgK< Gщd A[OՂwy[`J aõ_Uxr3 9^w ;$&+=V nbAilCIk!o!f!ÑwwW~1R(jZs#pK8 m%RwP}S?D [gqEW{ƺ,DQ>0XۀWџ?{Ƒ ;J_od;ApX;'/ 6r$\~IJIC3C%JttWuN!Ա( *]bfDqҒҡo QB"7,[dҡW&&&}cGسGo8(~T Xb0Aie,{mPn.Z+.) eKb c-D!@Np8eA2) 2 ĸ 2 w2Iֵvz5^Ӳf ZKKZJ7hBFxO*[ L)3 g0z0c--P`lK +JmITe z*ijF<-iZV*gqfGAdΔ:pt`UȩC2}?ـMmEh) eK6R! 
9i `)CQ'Zqf)@X2p36I6I6I6[uQ]䴔oe^FNS>)Yb不,0fwδΙ,Z j*AɩEI@eδRJң-9U GI @r+5:yAxAB 6h `B578U`mfLq h 9ҙ`$E E8BJɔӂ(H$.mHmdJEc5 S6hT_3ԢrлnNR|̯mJ/N>Υ{W[WwA2Y~zwsvƘ$젻s{-{X,i6[<5B.}Bcja n_\+1ysy3g]K9gxtigW[6/6 q |YXlp{2 Hl (3gL C#MX:m@V%(> 1P#"6bSw]Py%eT^݈Cr?ןЈXtu?{9N2*\T4y*t, z\Z26ۓO0$rFؚpf&2dm=t:2I^F8uB<<~\sfcgUZqFYd066/ˌ$ښk3;ԾV D>_@)_ rFˁ\C@՞ @~Hˬ,s7umbѥ˦* mGĮ1 +ɕn{fb`gDqEnlK@!ЮKdQ͟D݆O'ɮ P*61VR\4N%͕%qQ]WTKZCכ jRåBp~ޠM֌q4 wɛ[ɴҮK3~6VI0z/^eݩUL?;;PB`J ̔zKE𨹀Z C /ꦖǭǼ[~f,W v@fA31ggea|!zi)-nWbOS}u9<:@Ek5; 8U`tGqKήJ(a_glE!T?AFw'}uƜWSjϿq)%ɾRnA12v.d(oM`6ݙz!ѝ8?BHڱ|R{r5KDVbiAbOGָ@ע^0KkuH G) { NOٮzBpl8`W`|~^'뢀 1}R3 k8v!eD]/MD3.z~40(7ٹ,roCۆ=c{<ﳰlv]6>,E~Z#. %/ ӑ4x[lW3F-7&ֈ=zK~ un+zfjTD3'R=xP5,VDZΚ1|XX(gͻ(*#{vh13+f#bd弘#RS҇x7ą>s 0H"A-)Y /%L7{kK1CӢ\zt$IedO&|)my)GYpJo] ֵpz9IoF\}?{C>d/@gOT\hΑJ[.3R't fySf=F'o$>"h J:|+1ʆsr7/t{6'x<8e!rA+~,gQ6SBEOi D tN8T EB4}o1Z@;Q=NM>5". ?]^xklK|=jN&DIDŗBcP xݔƘh 4JEХFՄі h=%χ_^_(Ӏi[ף}l躣q4b4ڿcce?.G;˱aBab@֓&x߼Z JS4<4i|3~u43׷L<7q[7i̿]7I&w$T{Ug>V!C4^@ +OP@JK]0*g=j|Ϩ6ĨMO/nR"b(gkE!|n˺7 Uɰmeѻmq5I/=Th {lOp14cB;]BFK95 vm[sxu%aOcdrG 8ܦ˫Bf=9Z3sbPG5;>Gυ?+V>dOR&{2ٓɞT32+Xᘥn˼!PN0JK  E%)чJˁeVZŰMϧPp)ڝ|cN;0!+{Z"|v㐁&b0)Hd@!Fw͐\ȃi~6ſ4'[ g1|2%#aغ);+GN 9rBQX }RZ$@Ģ0GuQ/k-I`R{TZ(&^HDu0B8dAY:Pc= zE>}+6cH"s ^bmkȦQv!p9]LNQN;nl!V9sR[J4f¹(,A[pm$<Qk;TBiZm\)PRbz1鷀7YsH *m*REOe 9B29jlA .eo^_-:.|8W%`K}W:_Qq\+>Dx!_\̗TND`m8F[{hC1 $|ͽ g@;d+Jz2(b(5 8|6wI |'Z o  PtdGG=#!#HCE)KQFˠmjUm;lЧYNq['&:jKQoJ8Zҡa7EA% J LQ\E(ۡi} =@UߌGc~8Qg5Ʃw>n)Jf,=l:T yy`vИSJ&@<6"Θ(\͜yg3(Bm#9Ҙ5"}'oe' (| 73V!T`pti,EXF-%JT .|,V !8t%1!N< f494Y@FǙYX2#XI9.AD `=F@S%pPԮosT;-Irbuv @0M~N,M-?>]|yw^ūs;L#J RGhDWA" JR欵`.6R$GU]irz:-5R_J hlQ.8]4% zl57Nc`Rz$1c7/II,U%"ȉP"vʏ RX=8 8]`FT%)-ccB)TE?[c8K|6 ⭭¯V=˧5ơ5N(58 PpWv]3$ =1x}P+Bc\iF&NIe ~ߙ6h`uG)5D";Ns SB*@7 uϔB*` T%$Q/V(Q-('@W=#Ru(i7J(mdc_SuWmT,DwBNSOu o6K)R4édJBQ[ԦpFiEYz oCFy-ֲZiM(\#oZHcS`.oh M}\(Hi)\'nTzqS UF1EۚR|*h.lJ/a{Np._'k`8e F^[3U+FXD L*V'nrl [дܴ74{mkzҦ&\+@+:hS"ė+uM6ƬV{+`> vS@/'bO|1_{p;imD΢$〻:Dl ްfyk}=co1߹3<^/5%;؇mw~Zx 9cd|Lvn*厦N qLͬSvFثN%*@[3FZkTRjE >hCXm'l]hf'#uM[*Ku~Sj*nGՐt=R*{3QCYrH/&c='%|Gةv֏Of0z\@H78?|_1wyt~NgُxådƏ/iz v>yX&cL?[O \Jk,䕛hMa znrRDA餶ݎw:+m{@քrMJz;"ތyor$ĩ; &9<'48 қq0[4ȈaC3sYqQyY}&<tpUu;G, }G)SQGO3p;mrH6q"'>0eȾ!Utq S(;*{^lbc4[ X[>àz?#IS7]<.OlXXÜ"+)G5%3ޣZ *ZCKq&5F@!Āײ b5)8r qZ %Jm<>~޴ ;wL@:p+!lJn  *oOd|W=w5Wue_-suK 1$Ԭ. Kn˜[r_)Z ͡(äErCޮ!`THO!%`^+N=DXdQLCBVISZ8,08쒏GAZ]>A)K鱇nESk")5 D'J7=  l*IHqf3#DLjL1NfBnVdֺfZ`a$ʙs<\% J+05Xig,@[]x=U9%SQt7(;Y$0,̊itg{,Akwqئ7YvG"QkR0:&w_2L=d c/gTz?,6 9ğ7̪w[IDkNd8>ubAWO>L:b& yt&'4Ĥ쐳`;s!2&Ә;35M-YYDB[M0,wV#T:8{=Ȍjg89)[{g3XUFO~䪒/WKKFIp(_u2e?]<] m:qn/E'򲏘K{~8k%ήЏ&KEɛobv =\-[}ib^7F=3[D{ÇwmwM;k:QŻSd[1*t[if_;b ]ks:k;] bJLd '6mbiqDy6C6 wIoqt3%Ѕ0X`4W B\P #Us(X>zrRiWNS2##WCk3 8!HXB NWXYB@rZs^fNJÍCB*5QhIID2L*D^H6S ܷ.ИHh9+ `xOPS ,!QJ 9Ъ :0rbB;oMT RSMc" =sY ZԢ)8sV`ANp0* eh7nZ Gn,>|A93V \9$%FH99Lj(L^%H- !Bkxݻ|%ѭOZn'ff[]߬S  O3do*t-z~ƺ%t:\ؗÉ:8Փ'{UX)p_zRY'[SIsO%z$?[7b1G%-Zz୿ԜOz?˛rPLV^\]+ڍhf'#u_1$L!m.m5:'AXpwc$χĤc.ooJM(ZIy˻ q[*!Fw;<+d-b<л5a!DlJ\8y<7:ܶ^yN8 ͍"3ݙ &+#Mϙ`BiQ™?%Bg`0A8$oLn^e@j, %>c33c6NDtL, NT@=ׂ8. 
i 6b'ؿ>?}c[k 'ʼny4+C+^olôaDPl Ķ jhv|^< .E̅Kzk|_fl}}I` 9Q5i0 aCB 2sDLa3f-%sKyA>A"S==xWS"ivjR=ψ"^A@״_ȹtĠj?_0$sɐpCsÍ0"Fii7`ӝ+pMkӘk&;EEB  Ri Gha!X0YE`F kZUT0|dpBv@X-Q@*3.gng {^1ωy_J-LPrPZQ4@Acr !n8k˲n bߐܽ0gr `B A#4k-V2`Y ZXJ GK}aɽK\2冀 E!Δ0̱39,M 0&A[UCZnptB3+ƒ<|6NR9:+pԅeH"$HطW=ÎήeMXo"__^'x* c G rٱI'v̄ru5oOd|W{xpૢGЎꋚ\EW~T7 @L U+չQ0jbHiC^`H 'fX"Ia8eT(sMLLZ*,3?r-rQ H(xGVC}}/\oӋ?Łkpi1$;r94#v6q /K!rϞXO,}q_F bĖG=~Qy`ZCЭ(fanpVM1׷b8セp¬3M{5OO{ K6DշKE)A$Hu\'D?laz3S-ETgpgMq>T=e\5%~Y 'YbH_3iGai#ǿQGs"R:xJ~ /}ϿtmDЦ iļqFz[ϿϿPy'&v2`߱(`X-7n;7 zɭNKnl&(dY:.:[23T: Lݯg}AbaxEsO^|,e%ſPcDd) qftFmy?ѨmHcqH5͂w)2G`6MYH<<~S Cȹ,PB|δ[_vxXVi}!R[i65O[/&YDb)Z38Y~fϺq)aj1ș:(n :M~S[֭ۙ yStc8:ơrEnQSi;$@ 65@IZTSkfgPؖ aOCa_vn>2!>=e êw`X/l9ZMV* m5M(h{< o T9skSZʥ?K+VawW{r+ ^1xyO > LcOr׭ ~S:&$}S1 e`n:{E_^q,Ԩa-E heo%?Łݬ; ^UaViĎ;toQ}_Bp&HM ~(tKB4O gWd(ꂠ \;KƽsL)9J{DE(t(;^u3N7}~5U XS׊"뼷XR;e QJy̩ X c:՝RKz"ÜShV@[hb/-]G͢eNÿBZ#jR8uj^vK&MB߁/%) 2т&[np)'KXA()L=ϳFRzrg A(Mhk"s̝a6zN=2&QEy*4 mFg pRjkو Qx{X`gFdGOyDNx%VI򱊂b=L2+LУ28dx1e XV:i#86z 20@(@7vj# iQD '8x)AH=$s+ ,+06۞&-oZo{sŅ˫HuCҾdžm@fY؆ P dx-2:}@dFBx/o Fn%sݥ2h%pk4rL9مˆɩt*h՞ )T@،A=<}Ggز=܈тPΙZ/>sa[wjW/`gJ/f6gљp+9xOe ʌJaҊm֛O/-}Vjpɝ\k4qhm\} /m2 fo^@`̈́ls4 Y; _Zod"yw[-uqyƸ>sy ZvP䇄9˄N`:Ye'8 UTA뫃{N:nyCRzGV8(]0"iƞdYnb"$!9v3G=Eう#͖*ߏKs)P͆βNs:RE0[zʚ  _V/7`R>[Xkq| (ctA[wJJ/37]D.`ZᇫKء/Mzw^# j.yح mѸeLŚ&qB3J*tt͂(ap-x* H!dZüRG Xp^K.`pqqaF%SG7~pL$ fJbSeau aR8oC9s}2I2eQ2cr<ٱw˼m.vpnr#FeFeěf],5 `$c R:'5O[dB*%Vli{Wյ$*6wzKD*4Sj%$}OHR([=%hD!:qk @XNAͪ; /US;sy-\ quʹ;L!b!%|ǼvZH;Ai֌O$QhHAK=aj" 9}On=+H_087>9vkyHB U! ͬ J\ |%Lt [h …WkT 0Y 1KBSȈ4t1B]g]W)%i7b+|ߙƒO > -hם:O(B!Ĝ?=U0] E*T%(&c`r)؃;QMǂ2FuI:w#ʅOsՙ뫷Q^EK,9e#'B e)AӷW82È "=]l`j%(Q KHDEH%A}bAT87Q{6s4)#0sj ƀG1qkLHkB9O׫x֛;gVOGme[ʹPkux`_+aJplLUcL Mv X i>$z;3HdQ( A &.,%Wϛo5PaD5*d'8N eDDܻp:SX}!dޒa.waYjma]b$b4iH1][=P\NEs,U԰5ppp9;.7RÜ6L;#gXDM9KgOL0gbޓpTI!& e &H&ITxpwa$ b >@`x} aENyx9*ܰWG|R2rCK1\qItK<ÙHS&$g*ct(DlIO,؃XRX=O{w >|]0C -@%M@B%c ,Lz_/<Op(' y&\28U"V *)* Ȇ) \"x9> NPg=psꛗ`l%b[XX>Tbl-u&I-I] /5QHqRW/Rs"ϵ#d4!͚s,f9d=Rr2ln`tZ"|Y٨ny1?+ o,rlʹyKD鐾u7;L+)Aqc|I:j_wlIM޻`hdjz Rmˋшe=NV-)\,ȠO̮O-IbQw 6OϮb4+Ѩ"<6FGE ﳫO)i3ЦjLJa]=9D[Ts-}HjzQ"sHMwCuO t۫psۛϮMÛzkRcn]K%_׭P-Kwu](|]-&딳 K kߖŗ&t1 ʺVl._S7g3;ardh)Z5<3:B*Ī g` ,iA 0)"\7':`Rp.ۍ~)T*aELٳn$N [-9S%tFNͺgUfZ3OѪ9ŖoXR,qBVAԩFu;]VbӄgajV_ViZ̐>EG)IdZ$<~t@oe! m9Ҷ[[c(3BJOXQ gN""9bXQZ)WQý2,IvM~oнU p*kvPWR!q*us( t ŔJ)gF0NjIk¶J/BnKZNY%{LjpEN<H!GoS^Ӵ\wM ~"y1Exy 59WH7НX R*$yCFG/-aq9{j: E:IiӰ;`rp| QRǖƒ@h-wSķR:Ƀ1O#M?P/X :a0f?+8GgyΜ)E[ealbHcT@hΞỎ?ğ\ )1[\ 6Kd[?OItPWM3=E%aJVg=gv`)sԸ^'1V:@um""i@h޾# ns_|BQdkS&cnЖ,m?TBr9yLX-TS@H[@LAĤb 9Yy9\G4jcaID٠$2G#k9&b0ֈ( %PReuh1*#'RaiD$qDHYTG,,,#6Sѓ]Ys9+ ΐ‘'3أ; 6w%RMJqII%^bUXaXD%D~%ar/Y̠6'7_g[\Ukۯo/.j.6QŽ'T< eC CEج0p\1+OB;FuJ ĒIsX%ꪎ8]MLMfY.U8rY%U{ͲHsZU(1d\8TC*Zy9^kHRh wmiVmj@`ɓq 8r (}|a?\\NM/nf?bE CF D"pGw{?w>?|+lŬc 7k/e,T\9gYߪsqtK%R xIo~D( ӻi(d87Ͽ6}_4ȹ5!&¡Xh ).XE ~2GtIB $JCS=)A0 &[w˖|S>YmV"%#0> NJWZ -@TyO(Kbfp@!i-" BhSxMYBWj7&c"(1c:mdZ +оmѕFꃓYd'Nf̢dڴ<`VF:ǣߴ]}=s`+hDhLQKDh1P\6[:0Xst.9U}7y~3 LDE1@)A"j\$WkF'X!NWR#( Uop:~@)ۺkbLR]'=lng)xȱ=ϱ=j)jǫ+Ƿ#?FȮ93b9=iQϣGQ?~y7!/J|jׂx+6etY!ޜ~Hkmw7W/jWc-JW|u Ք!ksOu nK[-~Sw~PeSH@3RV3Tҕ B5TyU bPF&AKQY# ud4=C?;pseݿ'!Kӯ9KU𜔴atnߍC|Kj,%:pHHJ@7%7 |vV|?,O j&jZSJPy#8Bq(LDc m@}PTzi Q;6VRk4q?%JIS FH &Z'Qw 6'q1? 
TbnxZ39|$ݒ-W>DwS@RSgE&_o= tqh}粕X\\ w]b[p~ 8W`@sr&&`t; EG?»\߼σ$/Opû dzBN>ˎ/0?TiB%]_ߖ%BNWS#Y>Eohm~k/+遤6ppv軪>Q&7 %d\b"RĹ؂x{ƃY]9C=m0[nLf'͎{q$(`3⩙%Iܹ_@UKpkkx Ubhߣ[5vkZl8ҽS+ciG5G6Uytf9V[ ܉cH131GHFRΎWw*DlwQatLLqv@!8RP-6UAZu E._e\U9u@KIS:ȭ)\Ar*MNHaQV śЄizIՄE QAДR.H2.j#Y…H%AC ')pnL ,#ٛ_XFvz1;0y3[S1*cARdM>?v }-Qm.Zn5RKN/K?/E k{sIx<(٧csEPQ2>[* U^'t2`PJ=aO %.bղ|SW{L>MZ=ڮ 8q5i%yh!4IZVr'C0uQ*VT E8=1=;C=eѼ彖NuMjz,+_iA=hJoׂ`U5m MEtT>NzхVfZ˖>{87KϞ6at4vs+ N8g%`8yEʟĔJP*Ͻd_~<PpՇ5J"x*%AQ(DmH :D:|/ XCEqq(keצ8+~,|Lc YW i@'%&T_in7Ղ68MgK hYKͶKAe9[fPUQ3 Vkw C탔v p5ERa֦>#^4gyoba*㠿񇼒*/%Tf8+;@hLj!3_[F/onpR RG -\NUG{5;ȧ~2\U~h1OF:tr7B=jhE\ݺ׷5uU[ aO-Y 1JۆkM0yh'ǿg]$ !{~~'dl]9.q4ZwփWw_7Ͻy ^H[i`o{ӭ-Ys p_w-cFQiWƔS-;nF>|w=x!QQCb@_o6jkmE}OrA1oCQk/޻JoEj`%zBNYzq |@}{3D8@cD3)/E@8ZXœtiKu eA8.Gwe9*`if`2#Ԁǥ -{J{y vfJh c{3TJyǥj(1sŢ·|iOm&q*%ri#V@fPM=z^!Q궳sI]J8Ԃ1(癦y\b.L\YetyKuoN@JB7RNnF³.P-/6T"+Vh*z5YjM)x/"RCЌAgo5gȈySQoCNlYߥ*Mێ-ΔŭΦp%|/Gz+r(xVtc{֜g{/;ET~q^k*W{c`5˾/>Wtc6>N "{ ª^twf_<)ת+_Sb($9JACۨzQ+23 k̇8K` Yyz BtJR 5uȳ~1#Dɖy dZ2-׻9*eI-罞U9W?dE倈LӵuĂr UpRpxZR}zzzzQef_CN4Fh *EI1$tGOǑI^y@=,l`bh{el[JU#zl/oPRRgeCy d ":Wʾ/?*EQNbj  xJc$6y՗Hb%p՝J@yG!O[צ;-Z5HOcGM@tǎz~̎s)vUȒ_.ϙM)-&]/QR6|f2b>niV 'w8dP`H4| 7nBM no㧛0#BGGZ=2EZȨY0j<` bSF<Rh#B㸍PE)!-ފ)(4nS ;K2O')1U#M0c/HG dRzeT) NAy ?M,/ `!bXmd` F@aIf2Pui; -!gҲדhaqsBi} y5O?ǻ7dެ_&JYoh,ATsI~q ;}vjXI6=<<`D.|%^z"vFQHCe6tPWr]d(lK?Я\qQR|!yS:q}()0uŠJ}Gv_q3YVnMϼf-l>fJ% 'Ѯ9U0$n-*!uK(C('[NZ!1m95c 6y\/C UCg <W)Sj ]v~/!V],A%̏J[t==G[ gwZo܎V?>c/z]qB=phj).EntsOm3Bܴc2%[@3b?OY[?)!:nl$m/3\Mu_x_?[:Iݟiʤ0m} &iNm B(\ϛzzY'N&4MԦPNz7v 3#/[]Dt\ZksT[ j{4jGiԎҨGmx hxg $z9X#\ 8X*S(.gwڔ N k0:P*bJ) 0A@RYP`S.D D { a :8 ZenUnjP&/fs 5.?7%~uLThńxT %ITN-%n>RZ* Ú@*r&\LȵC]C,UT,ObUZ,/}da| atw1Ȭ"h[P]yZբuJuSun7m}m-=C{^~T];u_@yeWMK6mӒM{m6 5>R Gg:>H'&IN 5W5C{ *]Ico^Кpp G1K0x=-=jbb8os *z_T^JQ=wvL#D4Fj3].?ǝkl (IS־ ͊c*Ū ǪKĪ0.hP1ik@D;G4;_,ki9"kW%1C53%gZh=>p)$0-u-zǶ_|jj Vu4%loP*'l\h*aF.`̕KK_]ӹ/zqftϴ;Pޥݞ P`I {Ȣne+fm1@(hxCzktW7ɷۘ40ކO,^GO:,Bda1AaǓYMww}?dW%0"z$#2UVi)2PJ(e)(P@ 0۴$TTR(Χ؈uVUx01W4R3J}gT;]ES/($בs qDC,F9j QZ!؂XP#Rh#/b10%e[jH%릉`T-($U>,VKIx_4vY[1ʅz0_6 70-&T^vw]Xߩ!zb1PHupXU^ ֔k`VI'D#H%G nRR#8yqloGI--YD",zjsLc8+h0O>ʈUmq5RLIdhi2Y3H0XZ.ܟѧf)?(o_i9Ĺ_Ӻ {s&/z?1V oK˻/nC.ˎ]KILgtӣKI*->ӓTX򭧲6MTG=P>>)}rxXrdi'7 }Uw -+dh DfVVU;<][6@Wyrsl՛ygR-w}} b WnIXCH)+j=h\fԄ7~zy=RJ^_&Lp /7b)Z[W{v+Rk! Ȕ!U (/ZmPm'Ivk8;XVf<¿PfрNI)Nw!ܸj%yK{_~}F9u xՌc ;)2lGSUE_~*U?nu׌fgc \㛣]+"`4}ƾ+w~- O7L?\m2ƬnE5k+YGXg-Jz|K#!/\DȔh7NqjNgTn%?1P!!/\D%OTKA0oM@Փ;FUKq pMЁ2]+8 uf,v>˾:G\[ϸI2Fń[OLMq>0݆./ UACvZp菵%>%fz7^% 1JG?l0Je׬V ºO(qWLqcҠ K1^&`#b&`UȄjsrYZ}h.Z L;3QW='Cyq3v $;S#"_~g<xA di)[i-h^]`upqpw^GvHFQENH5wѧponqc|s`__|6Y|yK >ryvŻQe-'>JC^w¾fYPZa:ʰ4KR#6*&IQ, N(O'UհR aes1NN!JK1UI (L2) 1VŠk9 a%Ua8ZXĖXiR  !1U@x."Wƫ˱KNsC&7jqs$]-)hIN֜ߒn]mfn_<4DDh/?dbU>uG BgnbݦDg79)xJKɧTkRL }5,?tcME3̎l0Q ˽ש6cSBLC>9aV-W .yƨ!W||r|9Fa5AAߧm@-FS3_iƕ.A1{97`&tI. X_ݻ*%?cmɼeGA[;tۯ1n4BK'ZY 4Y&S3D,cC]RCY#$zoU*iFT3BTJ{Q5PeM67U)l\jaTdTJ{dWDz{gf!pr\#,`&Nj=N%$'B5)87AIJSLdfPXwPW\n<>՜q/.hն-BPXJvJ}Z `-*ӌE[8N"au »LJ=4^n_KxxB>^oA>/uo@˚wcfg.A<8~O Z]y;S*Sy{ڳ3Z{CuC{z*-{gAZvUub0TqdkB9ZX &D+ʔ"tK4Y$-N#8P1N ['aW\ I K#hTMx!.pƇ- 뫂B՛کj.-_V!} x_U߄* ܒX)4.nP'ﺩ? 
g;juzXU]8xCbuQPViJ ?X4EV"1N4-QfL,ubCR)4vsG1F>H×jx0=L`:ä1p4K˧Ҧ[7,M$5ie*If<1pIIF'[J`ŰW\՘MT v?@ ἀQ3Ќx4 Tq̝Rĉf@=LE7k|lYذk=8ڦ@k9q9oc{\1 E8Hif#]cqG;9z,؟bu\g`Mm^h oxr^"҂9f(LS; ULT PK ljpZ ,S0Fئx<QLNԿBj6y8ݔ u LhQװŇZLy;Y.qxޗchzN::7WyFWj=I$t7TW1B K=ԂLogw/l9ݮ./~rijOjSo\X:V2R 3(LH`lu`dN.#}dIO^M.JoVG,?+ϋ`,y=l6u[!9uuu'=8VL?2,7dbG ]w |dI2ݩ+U#!@d@25&t;]L|0a&8 RSذ:trt)GSh@Uj-'jBF ̬(3V0L"V5\XHfICdH^xۇb,ޟƫ8dzisr " &y "[vfA5߬ugBGK1juw~[O nj$鴪.zu֏!j\ .E=FlYwR<_ݭcs;jhUwC'z1;ו+-ۖ!l?Nͻk\[qk'_;[KzNP{> w.tW2pY}pv8\Ш8t-ɕs턄_f|DŽsI鏉]BFn,7G9" (<ʫСSpo.tȹ!6;EwWDQP5G0v `^Ν uwF'C rZi ۲ T`FtںUmobkE0T7-%4=I!I^SJTx_βY# nyqIv5G SvaXm(vPM=2rbf7ڌ @1)/enn[[ثvFoU#zKfm]f%݈YVn E+M7vv/nB]r!E?̘Y|N{U*;RMwLepj2J1ixK[/*$FǙK ZƑ*iF:"JAųBg}̀;5hG58pj0ѢOdS*a~\h6Z!]"|X9  s"<;2̈Uqƞ@+ƋTÍ>BvOJ1nFⳇuxK] |fe H+35lۣ.pg*uS.#%"]AbQL"D|qeT/ɱ -_}N8o{2K(Z"Ѷ^?%YlAV̟/>0mmgO#YJ(ItE)\֛[k@LVkG‰kT1q x}uong7.&f?2\HXZ'~i>zz*B\Mԗ[n #J7ٻGn+WXy 0x0v v%Aˌbu-g ߗ,[kbx;åќ KfKDR3 #6]b O{>Uǟށލ>fӪ_84+lKdO,o w QRa>Am|GqXIZYAラEp(Zd{X@uVG0rS'ϟ++2vbe<{,Vn~137KZhB?6|o<4XfP1uE95O+';|>qAvL5 ˆ X !-,D %(}i3;Wͦ.pdNSF)(w/ 3U/Ex$I$ҠI5DkD{V %MܼF6R⤦-7me>YA)$&iiZTekGUeu]K<ި#ٸĠJ/8bTC"79`gb(9f\c*ⴓ4@i~]}7~7"l8ߏ'Pp%l87qPCU]S>m#?x3`,dX~o8]Cc}v2]ҔP$l+z)`M}ɱQl3_ˠ.!TBXA`Ի+G럀9aK{+)5Dp]FRP ! "-/w)hZK Ba:"=%cR$?ؔ~ri%;WD+0EZuP֔SG*(Ӧ B R"d(S-"XBeIS f*5!4VKM5Jz`?34L#%wOZ׾ $wte#"3CdդX~1ZT __ϓL9w(ځ(I>k? Py1炸z=Xs!1?hImgN\po 2dF)On~{f&CzAcKHx>8l$2ʷ "û!ËQ|7_=hI>׻˻7PAoG/.r=hZ+ يm.XZܥӨ='܇?޻N{u+2i8ЬF?\dU ֬ϧW 2$λTA>Z_ٱ80R?Ի7F?OVm A=9=½RtOF#0O$Osy%A$M:IL:~Qe:G6 1ֶD`$7BT]nyvMC-ͬ(4rmtxt_۞aV 8OeseE,g1 l56RC${ ݆#C7̈́0gsMz*R BX[[Wgku[m[^[j0?R8Mva=Μ:drNlpJare+wa]kng1Ӟ[PWࢴt0Y߿E101 Д0 )8D8xiG #rOjex"^k &9Aj.k8RY[iW/uθgwg6= &sO/%bh[oOJF%YZZ.)%RYF$c6Aeuk0G qG0^)@fj,dbnWWSr]C iã}\7>@æ'냦|ѻeUJs~[.EA ?Z dVJ>z$sc IJJtSL?W{}l^5rhՍaUflv6=C k?k;ebFOUudi[ ~~iaU?Hc)ywfjՏ|<_ un ]t${0E*^ufߴyYzƎ*M:+7b2#m17zey!R~<ˢ|w=ϛXZ=wzNL<)aU+-)ry=sw)JijZ4$sQ_MC:.Ͽ$!!ngOo8U1I1Gm3zuE-$hl|Ӆ\&i[FQz${0fӉ?,ɝ+a BbL  cо~1;mr|+5 Za/H؟tf/9.@>O 'pc%hq)(Uj/x J?%lԋsǺ80娄0/Zj73v^" )[Q:3\tr Nkh\ρ杗>n;Eibo ar9jњy BF=̼u00Ev ^1K+߉걦(ַY6"ڷz&ЍNVE2[%zXM߷&4kbdvQ4 eKuY=۹iV{]kcS ljÝǧ節 I5XW'!o<ِ,hsyh2]!| 1_[|c%%]{[A0Zn o(#-ĸUH:)v7HJG!c[w!cܯ ٝO;b7;b~~DMGȜsScs[>.*vlqcocCu;߱P$ħ  0IU(|V op2!0FAD?&ϟEٱ780Ɛ!Mg7ޝ AMey'U0["4?i'ۉqWlQRyqvYK+`CauͿF"x&-PAADOBB4~b7kM\߶Ocz8!"=\w>T?m#UoH)W2Aewm+Xi^mCgPĔekƇVw?C&1cWӏb^?~_ܩ0+a8zCVw޿S@C~JI:["@(XYQlIvBAwUGB˅" =ܣs(cngݩgs3ȣ$yA"|F\ |u9_>v$~9~i<k6i=d8zܝ}6Ԏb̫*,|Uy4=fQؘlɯz!Oj@xاWN<.dnd4/w@gWBy{]Q]~ϟmVXJyOHbH+(eo+y){+XُG[ ,$v,i1ҝ_zW"I z0c-ĥhEJ _OxFeY8rBUS \✒ɘ5 p}հRg꬞y76j&44ep8|&vP3P"t324vS)e(sFixP`lfN .Ӏӛq᾽(=3_Hdz9rM!yI&Ca 2&Jsuui]F#}tבyܛ۸Km}% [@^!\_AvAGK{KuAR%q_ѮtЀ$p^Gq`1REǸT;S!]I\b̞RTH60#@$ 55c<+ TzPR0 k!mDT!>_Cy.df[Yfer ֵᬗ*Ƙ3(:njK[pYвmBd#Ld[=At$tυ5ϏY=&HJ ^&_Cd rF!/"~qqe)(ø B V9+!Y1Cy,+JP_u|.^2Vܔi^v_jn*y}Bi<%Objm*xeD{te{߮'Ѿ.[I FߓOhܦ͉ӥFns4{:Ky(Bnaqŷ\Yڃ߲z@ ]Ÿu}4Y|[$Mzߡ5)̸ ܢ %|tVjCtK(Tv$rpCGT>ljM-̿>—w`TYp%供OھoQ7{/yepC8\=^giA% d4A,1G@1S44a#|@~{78y;r'0J(G.^fVyݢsbk~/&yWHkV TvxxwP"T2N+;_X^0^}C?PQi1ܣI NRS*1H"SQ:ΤFGOԁ&Vö}`ŋ( w) (XQƼEaKӡu^~c]8poom^/v'DT۴r#V'd@i%@d23c7FX Oxaj{DGts1RC_t;gwLBC/ڐ LJXjtH,1it@&"!w>\jƴwa< үDAcU iTS~qg`o¤ \LXX )#I_vggPf%ĎW l(j5'7"Q>0itlTg}YWefwx{ְs(A 5AOI^iX/|ۛb澾_-Lĩu.j7eOmd'<\oDfH߿&_}Ue1E(~5 +3gYo3F? 
t{DW=۸me4k܇sLmőiCj>_9ݧ~u~`{ɀ$BfQŕˁ;{~cchVR^}Ћ`{JP}i<ʎxsoJ.Rl`!ډ !xMY)ND\p=$&p*ڻ`1ǃE#Jsx?m:IE)ЗfD`"[Yrls Ψ c ĩ\RarkFy DY+$(6'~ Sc0Pݎ%LR0:_* kLEjz>s y6(MWC9ޏR!H fj~7(&aٺF}XF#۔߭GD^ [g`5U5-մmѾϟ=cKg4Dӂa i1&rjodЌ'rΤr~ ^((w7+ GMct{2c"~*Z-:wWM`VW=n~~v" _]Ź\^Mp:'tҽ_S`}1ϳNWuj;~tH#tVZdR*MbP5EmJs1z0UQN^;K51,pM!@AdF,NdF t2mjVGZ8 +.YcK/I i"{R/I34&i(E(=.µY#Q=(OЁw\<8mę]+0c.Od_ɌO*+|\RIk9.ɇeDcu؁#!*yP]q̄j =*CNx% cR9nR'(乻?.I܏1BV.ꦤ)B 4"󃨊ݵN>r23hgN*ݔJ7˕͐nS-UA݁\|IBw|"\na=SM:D%]$LBx$$RVӻ]~gݗw rU=|꼦tkz˧5iYޤSkzK2ڪM('35GY'YX~i_ά{$JVc =;/_o񒡪1xp-w-siQ+`mÿ<(KRx(/.tXjq{%zX :mW1Qz#ccFv^u. ϿxR[[`(rcewuGߧX4-S_O2ں:bir]}Zr6b(1t_/_O{ ilY,ٙ-Y, 4nsfszpMG!.G3q(Q9NDy I:ݲsHfȭs",!r;ʱ^!rś sgN=Ts^ JU75;,:OXyc5NKnnMqTYQi FJZ"6K'Z)|ΈQ8]C+cKk"0` T`]J5DUz/P_(zoUnfT.͠x:M:e 1bXavSy(2/nINRJَ2iҦ@DKm45Ya}4E) &)p;lI/3 +|:TԤCXYJHeJ]`T9 X:!-͵)YÎR="1Bc5hVV8 zk` z*7UHAh7֌ =[11`RT;8T2c/%#家iC]mklwʼn ,D#-eo]mlS $S`DkkvϟO@;m{Sk[iEt}f_~>5&5x|p)`~|ʗ+nl;nlM\~|mYkg& ݝaU]z!­L~|U'F]d1`Um]5i;j;܉12LN_ z'=ZPR9B'HG'X5a7Iu'l`̫U-G$ʐ]s<Z%鏍֒%Gxhq;a;CFݨXh|&.Nw^cBCZ(:WS`>[؉e٪N<&ygJeVHC$5!Cl%Av,1ߚ@'CG y4^PaŰ 4w:0bI4TzV6BBGii9ބj]%F7ZkOx8e "ܻr)1{&T+ؙjUU"zCcNb5ozutNE;lu;ƕ/F竿 vvF^ &4}GIy8e `)! }G+I*tiy?cSEL-9}W|EZ"(uaSK\=RBCXUXY潶Bm*-]vBA5 e4}Gv>x(M5htkc<%S Ij8%-rPqJS(J~v\+Zֈz4BXQI,q;D^$*7(g{qόbKL%PGdw>%)哢n1/o2/țM</x.8p[|~WϮ<7E23[^IȖˏ }_aqF3M}~[HC35m9%Ba Ml@7_?Mss;Yeatzb޲k`|{ d|o/_O+Yirw IAF~F%cau[fJbII" -|kuu++{98Goq,=.鏱 ,X=>#ς1 mn#/zt\G._bl69ZqqO0+~a{?+~ڐgb)Ť#>Y0UO9yG|2[{blBeOw6rT6>Y4a%&B'KU 0dP,zRy=qOJM8%<&Zr[ɛbIGr469[yS9.sn X"dq )A\a?H{:ZjA ڼ.oʅʞJBXTcEX25lv20P,-(,W!7Y W34u;j6FrvN(D kYz{޾vWH쏟s6&B4WdLV%{UH)±^$q+bIK~SE)aEq4Sa%$H^hDpHxCWH:rŅݛL.rk43 *W.xi4^yrpLKx`lsUTSbw)q9O2VPDzܭ>QC]X4 %;+ rՓ֠q9&+ $U$,>ĸ4-eUQR+ZwRgp$.܊U=ڜϗ6==Jce:acفwPaXd;J ⾻^]1ki̩{ ?80&hUl#&t$ҹ"+O8SOEI:E}āZ9$'S'mȍXmxXRB<#c | BIc):F)b-#D|;~[ Uy;){tL!HbNQ t.G!H ːvHrA$gV0b5!,];v&kHZ<w1 9CQRAILzla0 sCQŷ`$y"cJ̨q Stby+!HRJ g : 1U{i#S:"cIl%(7sju {%(}{U.4n4@ɨL-ّ6FP2TQHrpWlT+95OɁ  Hѽ)(#N,/PxìM<f?m`~&kZҧI%@sӞG_>7ibl^5BOwa<|yZ>>-o9E.kwWa> 5guE;γK`SDtE'n ԓP 86B$:rF(p1K%aFHT%] R)FֈV,rIrj&x*\p"[-.k.)k!AL[D"JFyąח8R+4l 0 y͕tܸXF=(T֝DZk(P5Jgݔ > Nۇ>xސIAotkďMBw}j1|=`@ 8-"g&N1n\}xڶ,{hКOW~zg8!_5Ii!Ԛ$iCuBZs=U+é˪,=Q@1驧ڱtoXv%HGPESJne7SP:4  oȤ\89ʦZJ 跱A)vJcqUеH8>\gf M۲?Mܴ:]j9F :*jrKM+Ǻ_;)uMϢ[uMgN-&1dYMӘGk;Z3yVAO6CIxDq3m _Δ)$ PΔ+6>rLٍ97yqܕ^+(6k̃TS7('?КN(J!<;7v "a]2֬%5;ERZ67r __$V) zuk\(5ژL[B8%e tK2 d|5/G<,tcc1_TzfoX> J`zO#El9 SS*eR7){QM9Yr/-ۥ{8h1, W DG1^ g9M]Ѥ㴼uqR7~-e 99ڒTj5|rxXj55 5dq!w= wVʑy蝘l}ny+>Ahm܃LjdF >, Ef.u+:G0E.7D'%i[$I?UM" @ |B!ep.2A*h%T\` hk'TQPTT4kZͥcjLYR/)N;#B9kڗV*&'P(jΖ-VJrV awCv>ўڃq( \2;ޗĻ( b<)JJp2 G<ӡ=(x-qisԚN0d"V/s _׊]+ޮJgt?F*Ǜ2â8(B[rqRXT'|qV(c<m=|3XY\>elJ?N+o>{h1wWKxD^_aRCAJi'=ܶ9o@~wEC58_n8ppT1MB D(h#$!HXTy-at\Ǩ!T H H/+v#PpRfW&B`YY0͋PJ J'=Ձ o} -E"w|nJ% RpbP`p8hӛ`TyM0֗S\3^ cOfTZj Riܶ6pS`\}LxԆiP2`e KRqoa ܴ\쾜||ߝ OR D_/v^NDžq._RxZykʰAUKF%hPI!Pd%VJQAiksVYPB=Fg'SӉS?_ZݫUN\nK-X+Y~m#x : )6H&~,sFΕiދ+8˹ D7kNY[Jn% .E i㢣F:TjFz.覑nL7Fbc7I~ Y|`Pq%Y~?^`p>w/}"#&MN0 Ywn_LkV[#2bH猛ZIj'X ރJi.UDJ|.FqvJԦ$ Z#4skH8BEu~,(b ,Vc%&pJߦ)ms׎D"b*4JReE*C Aڒ!4\+pAL5_=AMsԤSkWbtRkQ2A'KNU*~W#,%j=6Bx9M ]Һuyb(C`)( zXJu + 3Ф,&=5F`jxGZe%5*15*dSjxe.tqXɝu3&Y9eC| qZ^PCjuw7g@$o> \U)%NrxfYePꠝtµAK8R +iAM|u܄ib[Yʒ0`ƌPIj:(NA HTs1*QBF25j1sq^`zSi@R.ȧ!oj-Ar߯uYs4 @pC>x~.{ NWOEvbS;w_eYZE(D*+!R{kHf)L(@APOɃ+vJS;uM̿^` g ޗ::2 s5?PZ$1x%+= HYkseʿeYR[ntelmTzw/X[jtCRoSHi#Jr䑇rӴ2-9[%w80!zƘ"*Yjw;)E qh0qcevSkj-۫/Eormtv.;u"*'oUdL}~fvpE戍qC5[IH.o-pW*ńx>p]-9^->o׻|`%pۇV.>ESY=ey?ųE{A!GxE?4L|>5{!apGՌ*/ ܗJ A٢( RWF li ($z*EH 4z.yȒq%w^BqkEKS"jkP%Sԭn-ԛ]7'f LoH2 ):%d0`i͜cEB0S٠@`X |ʁ֓ײ`N1t`KJ\8bBiBdYWv8Cp]].@&Ex; %zk$:m:U`'!cR#ޭv'o3ĕhB>AN 00a1= nj{wm=i8 ޸ 
KK+Bj-?j!9(gr!̆K7ISpdĚ$mZ$=|dטJ^F|$G[fw%{fZ<,FG395޲mo}skqWg$WDPݚ;h*#(0(  N<)8,"6hnZw+AΑgЫQA5f/:z ;F)Xr$uV:'Є:%%NXu 3W'?){X)$u2<W 5pDQN[4Y \Y+eZZoN.*1Ba%c?1Nj4FIydDā ުD)Z΅1D1V%.Ԋ?ԲD`|ND֠~QpZo%XJ4u:C^ږH + d- EL ¹#d$'B裠㐑_"2rfBF#F*V$adN+N`<܈h44a:=2Fȹɀj;T1#p? Olotѱw\+:L't ƴ-3a0fj+nz> N;ӳk8U04* ƣ~.,-[cL&QR,uP}x1zω`@ #Ȱp ȌXSㅏċ(ۣ:ŠMeQ]\1jfPѶĜiO% eR8G|:=DC*R*X 7""лA!҂ޭT)S:Gw[0::58w+Jn5,䅛hM3F w+AΑݖ.4>>ۮǕÃ]w5@8=뫎uw}-{s-=~i·?i_<#=#zKny+EG$de1 ؒwyi;LHbAȜj=??I1Do<@d =ޘ%p+|> Ѡa#!*b&dw..dwO#{=iDL'P n U$'"hk75H{[q6iU`N+ۙӭ8.)(Iw5H:~WsS1ffo@rx U]+V/0A̰(D zo7w5_c'ޣJAT%Ol!(өAy:'"z* lLuMk)dːȰnuX HPbP-hڠ&ŵ{!lW\`ysA2Yqx C'+{u AlxرJ^SKV(:5.򀩙Zei+Fyfl 7ޛ$ZnF5xRN9mĉl~qjVӒjX 76E2ϡWuJōbR]4-A:+].P(Oc"}l 1iBy҇SlɨueQ#rjrų.4)% v?/;Qfó͇?B^\] sRo y=A/kɃP&`ԓ1K 畻92n#ӵ:x)Q]AyOjocMSkl7V(QcAByS ^;O^2B[ F|qiL}K:{7mGԹjp>ſ5HjHa""K ~Z&vKz)ӬYzBObz&I'X9 ɜ_zHmRI> *A%W>mJq dAH|G -dC =~=Xn2T0]bHyЍV] \+~R0b>-'tT'4hv~j OKR9^D^j%"\D| &QNW:|]yi9u-oZ-j$NBbW`R8v koŪVUZ;&Sڀ9&$tvUB68@%B[NTp&B0FߗPej\`2VZZx-]VSsZm22$HVKD K e})0$>´g&0mJuT 6k,iO##Es# ԩ-Lwʍ[+js8XY,bv=:tp8O4HIFK\9Qe23D0Y`VzwTVzVI*Ć EUv[M6T}{{DC`:c\BB目Om񗶄bhT(\!Nq=#R+n*AAAkBcm(`:b|,e`Y.k3os`Wieh 4G ؝g>eS'b 0n-ywf|s{`4 #N:ѨoOmM k{w}5\H4pDɑN~+Yj6 )vЬhdxxȵx`S1 $Bm 7f7Mh!4(k4԰FwƝ\RF8d1/Vf3+,hVX$R5Zh9SF_f*cb `C`ϩ{0ylu1`kExGr'OPbP2>|`f!(kf3xɍfÐG׋nNmo9WP?!R \Fy77 lwHkqpQ㣾{7C^+6w29~Lv: Rۑc  ?{֭+UtMnj3Rx:mWT~4H9p^p2~@É$nmdB'+ȉA+,K"d-T{4qjY 'dJ6hMuOȗTk4ز# & Ԗ^%l*SFX>EVVPRАT $sP vT^wtU#@WV&!doTM%@!)c9h'٧ %Y6d XP1@ճ YɎ+h]FPUAʪ"isBBQfkj5~ ;m%( *#C"lZYHId|iD<ҒCERф$jҸy1Kl2?Z 458Q"!dcΨ3gShMNQܕ$qjRb$ҙD Tt!f# v > DTrZ,_n$yp32)qyߕj?MszYst4 i^` ?˯ fK&]+n+Τvw뻁F∟Y*svf?v1Vb⍉P1¡%yZ}e_R줐8\RIQg> EUXQ<= l2=-|KoWK vC vIqb@~HܨčgX{FXBxдoMn2 v $Ѝ2CE ?sg'QXum5ޏcZ^;k:S^/Fjb ۯbT5XvOn(h=}zI Z}eקP{Rn'yc<{RRހ6lc]m%h.c|o2So8ln/uwnn7 \]RS?JKjSqo3G@U D{|R\hڤIHHQm_bш_U__mfj*?8KXMDOOm=V粟w%Kn>e8đ>nslKrُ^J\dz\ˠiǯKcj`n^4]Q9a p/2RIlQYWXXzV%4-MpXZ}W3rU D%HAQVm3,@;jo9VӸl+_ _L钱/J9wc| 1B;]ƻcCޗ~zZԡIlx`*8Xh}ꔧѻZwT }߫yE2&)t]}7%z92X (]=g3u=YJs9ǦTHbvQ]k/T:`4L{G%/.8XDA!b֤R=#x5BxoqzʺS21Ӭ}Ad#sICᩪY6b,֓.&BF5'5%'U YESZZPhxrPyP.@+xe\5*LEBszdJF&?,u?F7<;x"IHF P[WF^>x[PxxЎ(Rwuz8#kGP4[>2L|oXp4'~q\NXg)tI.ZU}H&RwrTj!K)qhm'F2cg(#U*Wб:g_d2%]!VőLP ^<߯-g_ 6)BҴdPM`inE0xqeZk <6k6dH&("$ VIJV4%g9VU% Qk:j1umAGdi*[ UMds"U^Yba]RLQ\!\H. N$9t @M[ i1$K9+HAgzBmm1٬$ܺ0,45d5K80\ dHt(qrQ:.{P蚴5Qz*[Pv_C.$GYL$L3O*a_$K%JJ=uKnC_q1v#ZV{a#[ 'kPZk~zPjabLadeaT3aPELcGKG>iMͿ5e 6AJf>A?ק{L#F+tb!-}U;ow?i[$+l,ZêU-=`EoAz-18EʝG5͠t}+)P|qXb~`ޒ s;BG(G AFb;֝&ZtlIiEv<akԡG\lt'`AȰIwuz|PǸc,.HZSж1̅lB.d *uDY{#1z4RΜKw73ZPJ@lQ'(} "U.Ɛ%fy{h*trgAہk[ m(vWhe?s}#=|q hQDyAЪ5z/#LW9Y{z3Jnm'Kׇ{!`, iEe<{|N15F)~u{op,LctsHJX;d^3|>XivH20v  D.d?$!B"K!)=#pY%>>#g|wR(z5H0佹\0^k0 >u۲ɹԌ3Z#)\*(N)ȥԂ=)@3ّBINR[s{vaY!dx^wPg/}^{/ؙ,nQkgW^%^Lu5%_[g.T 5F|;cNENGqϤ2y."ֳd>y,Q2Z۔54 ^ڌB^Q*E. 
Rf2[4yUkL&s1j1NG ԇLR{Bfٜ h,6V\Jl}5f.q]0ь1E]E荋4 z )Rbs@grr2gUyץ}n<:^F3f>>pGc v=Z^]~{H l!W@tuPsn˿,̌d[ޙq\@moYkB4TQ$U+ TL9sA zZ'r|2[ϝ^]*NtS흺 흺ީXN8ʶn<'<ٖ$e'Rvl.8:H\q%2zGMVUsTOmFj:5[_w9/į6#@ZVɆGSFҫփF@jcIPY!O:Z\X7EavCYzՇ%ޫWR(Y-DD5|M\&K|#-11{y񊮏{^R#dw"L6ls9`_Px6 k}H9fB aFk^w,^dlGbׂ$lGv y' /cr-_nR!ϨX"f,Kn +l <o'twnQ> Ž^eGZܿE xluIo`;5E^\vHknn..EMbjɽ\\_nnqW?/ocn5|ko?~>kkE*uT2߿~n۾oy߰?}%[V._2HRzqߴeviH#^sY]y};BTFɀ'|^F ,CycZ(&aRu_\{edzDڸ6LN[4URVR#*^%[hKcx͹!f]"$\zZ[t-hdRns.*ua o5)_i PBYL<Ӊg%o ɓ=P\z{FXBxgYGaJ.CUݬRҠl(Otd}[o{;՚o+#GE.j^$ N{4xeCG;uI̳J!AR27h rοӻSK], G'8 ,Ne Xn4Ysm<$!cIK\MFAR ɳ)<zb7RR\*.Eɣ=](Rye;UZcjt Or~~=?@;.Lv=r;E˙ߋ|>cZym_9hr~u}>h{uziɍyr7T85HkP)[XpoX;i Y$K ͲQ%pN8FfG˼aegGmUb*)&"2yzYn܄?.o]Y(W.<6z_/*U4RX?׏/,83F#[ >|;Mr[ҘNw';2wkm̾Y)Ĩ $ixwq_yv< ,R*Wڣ#WLV?nnhؾtxQKAŻoPTR)f]Y tPўնQ^*%3W7K9BQpv6 C78LD y{3|ZֳT\*ƅ<x!u!r ORgC9i vx5K!V^CAԹk(kXVdEL(~=|j(cXh,FI\PݰnP$`^rPgaTJRG3kQ8x4\,[>&XlR aNq|e={KPw i|nU'巏#P1A z(*ëS3&+z,U_FLLs'y=訅6VBl6Zwgs|2Jke @q8ˍX[S][yBޯ l>jRwk TirݟC{0g3hZD UiOtT'r_UZv|VEp>uCH{`| C.r=K=h {yX3kfT`tJPZc0a2.u")^G=K䌑j/ MJ w; \s]dlR啋7{qW׆Y1žf)˴կ 9׌W9ukrSZq>um 5(g5{(nS㣋2#aKv˱!e5)i_:N tT|JjcNVJ{?%C>;`k a lyRLIweNzlC{cWpW܋j%L\}&Ԡ5 xɩJj+p<Z ׂ9~|Yeϗe>_s+}!:DUH gSk c )"S ꎯC_4З,{N͓WS>T;Pp-5SJ3Y 8}ͳ% 6 pʊW/ fL!fi묯Msr+`s#`JTm?g5Z^@~/UTv6`FNEA- LHsP rC푇[6?RҁN4cI(ƾq↥PAd sV0`l'x6O-h"u6 C`NXzbA(#W*'ͣ5A6I((eRC/5C +kF8P e &fԔĨ.Ja-`NA,:Nk} :,ؘh'd8abc?JBI5FZ"3}Fɤ]*h`t&=[ Q"Y(siߒ}!qqgKG,ɐ#3;#,e>K'\@dF6<@OF 'n0URQv[x|&f9O>] I K-cNJ;L Nl ѦS&PZJs2(8QpMKZ6\&oe26H5x4.([=jAG!HD5D%-_Ӕf٬?Ն =gH` Lyd#Ss)@t,$s5OPKWAArzX,(^{$'LM̑ :nY!TA W9D`Sms^s*GI+K0+r^*O% j%iЉ6 )#ioTѿ^?{ >ɷ;p w mhnmdHSKq]0D;j'}n;~H/_~R!اUH# j KC*GNHPnS4IJ]6!+ʖ5 Ƭ=|:NY  A!Po~qd~,o=LF$(\p wZ~8cSFCN 9(9/M=m k&W:%u~BȒ URfR5uO8--ռey}{1]4xo_%H]r Uok76Iʙw:6^6RgZJkLdEK]jQtoF}|{G7Rj4U gǮnĺ9{ c-%VrN־ +Ϙ\~nx?G]-)_?(U1K,˲/Rl/vULe6#zHRTW }Ѡ/?lZa)3hTh}Hb2GH;ҘAxi G)1ˌ28#rvs>"E`Ǫb6bx@5wI<T6zms嚗OهQKcB |.v90qzؔm|+Щ:LLlElK `CЎrJ䂡WFn2s. 隱!v"ֺ ܐfy|=q3]#.W=|qyϡ.YΚӒv[M96]m]g.J7YMVOϮd o*Ur! o2wU7CĬ~Фf;z, -E}VG]Käds79-|W쩅fKFRࣇRdYzDŽךZ@0Y^'^M2 G d( A* __`j};LiNo‘|aѨQҐ+ ;B1M&IOKqV t%޽B3NLoM?~4'rR7uwмNKnt5Eewɡธ-wOuw_/rf=1^=Wlpú$;JI ;LJ)Y]鄣06<$R zk-ow QPL%fpbec=*4?!a uqxK◻ B/w~:=CmbA+Ry7upO|gfu,%./;C$/?ϫa%/}QMJ4q~)<J֌WV } Ԡ`zW:ALgR75HMȆy4d;rzIK&v==Gxjunf(?vX~}n~DUrvkb#_)j+Q]{J[j wswU@ʔz{+ ̧KChOA>-ܘA"* Gwطa[^gkR9A4 hАl GxATH['Q5܆ÊY}pBJ{UoHjN~6lLm t$Ws= e?)"Oo-]X)ad"z(pdc〟M2M ~ zG))%{2p{Bey &Q#GѻyX_3*2j BJyW6[e8T>{hA5̻zU ؂)EQ<\DdAp$>tdx`siPZn (v??{WF/=i[1|X$wff-~a'e,gfpHI[rK"M;Xb=b.p%8[N90?s?]LZDmƾa\k2 AT Ae@OW:hcq\mm-lOT d5A%yt@0WT)*8 Е d-.@%WA<QCzϮr>D)@Ko7}kU+=uk#Eaܦdf #^ A}ݸU㌮dk?v"!jH{ /ɍrʑ)VUap. P6.6}xIo-38cG򶗎q~q =c@X~6)>~CEzrh$k_}ԏYHQ/F1"!r¨FM{.yp=#Tþg"'{ &}gx嗵)M5.!֚낁j.)(fLֵԶ.9gE%ϟRMg 3eʋ[.B! ZVTRB##DZcMM쮀 -AŌqw Q#!.;2 ~BSk'ZɌ,̻0JR?%%jN pr% ?ΨDE-"(Dc`"] OrF$Zyz:Y<=n9 P&8rP&FJbE$TCSɦιVDW(kHƶ/9iS[6=T)RFp5+5/ %-@F77D˦d` D]4s-*/Fk2;ۉ}xp­|ꭞhV\9Y~1AZ~\ս?}u)<(^>b>94DW I1PTHTAε1.hjBU5l9FqQrkmT]CJU )BQec43D4UI[^ڋ{` Ohm>L+=j@|EڝQ%XN Ԏ[YR$d !oWLu;>x~\$\f (() K]2%x.!քKKMQTњԌ eYܢ$**T*KD@iy!엘Շҟ P˨ryAWJiO>7TN:HՕj:mzQ'Q2͍uy=E,뭟of+g#Qe=K+ȦɩU+%]^׭7 ~:뚌^CE2U09w!<ٍ]ߝOtXƭw$N ®2¸NU#|t%5EٌiZAeU -X\oG}yqEڮ,~s^0CxF~;{@{R2!Kb5O+ ;T8$~n]A eO<pe=TzX+#jS}YTIV4orcfDI ȭs[ ])If bjҽZڅSy v$"9s&@zi_CZ y$s #[nJUjKŇK,3(0qbTUVWzZ Fz)EQ4=v+SȝpC .ݭݛwӓp=2kZ%: _?pW8閃; /-I%8i=S;+] B$zwP+Fx"t>{yٽ2%yzDe9k]HW{«hB훑w>4kE9 ڳp{ ZqW*!_coW/ "-|r6Y~׸8\ǴS ۹ =*B:g>7 (0rgS/*Pqh0@ɵb|IP@)1B'I3)f౮;|\>3 )@LB˺ "jU"qO V7Da4&6- 0Y=ϚT8eRZaY, SuiXST%y誢\p(E0idەRWzKL:~\.-:SCyo_qezR?XmYqaFB¹TX(v¬~Ս6䔰i[?zf\-~|W\*sx?YiL2"؛uqpĎ4n&՟ÖSg8! 
L~-՗;ݰSI4T=RIajHvMނ6l + 䪞IQh0SjN=J7p_3h.C,,:n#`B|*-x #_یzȒO ʜ#Q?,(w ϑ Dn_h+CK&H= Ѕa,5=GKIE"sW @>Bd)owCD:vY45 Lp|א5l.9GM~~|5_7ӛ{m `AYN /% 5b<Q&\v D1h/䤗ZDԞTDz'u|QDYT4't8%&Ĭ3^}lι,=ӣ3 vMY]cx w,>α&A~Q܏}w0wӑ\ջ {o|/Y~ Ep;!?DLjW1{|jdAGhHy^ͺ_Nnvti1ߛWX~w?(`s1NiQ/sE q)%+*sr@_ps%%f?3s'$7r#ߒ21*(FQgqПcrU TO^t /)i:$nn 2=fTN|MKJrޥ7r'ONnp_j3?,;Ej]D>:ۯ]*UW WSObBݤU ښFs_E2p\Bު9()-y<-EPZU{'*ЭE~>ΈĿU\-5@c@TQS^s=X96LVMl+gTJAH>Sǯ;n;o0;y`rEҁ%T+/, P.CoҼȲRëK^ . SA0dF$ߢ#UP|I#ƨhFDXs6&i$FӲ/8,0xQ FDLވ]ѾN]퀢"*rY / TU;`.๼Эvr {۵gԢ3dJ,6mGh Y(,gmzTbgH RWfZqFV^U( 1``>Z# dpwA?Ђ~r_A?7%A? h<+LZ%eW'E߂~%a'?QjR.v:S([VoLVTsou[#]T/Wc͢xXN՞Q- _eG,2J D7'n]UyJ:UV_0j ; ~֑3yQw'ڇEڑM4X!)Ta\y*;:p,01)x_GԆx%)px6^+mw6H# Qw Ua #Q`ڍKӧM.@^M z.nh\8d)F+% Zpҗ%}q4N,B| 3NK5Ngk3TF"'RXp:a}r>70I-R6xc?7a% :ʻoo)GwOoViBj[ px 1Fhk(8^_\~˛\=22go|S*a l|@ ^YH^8a"G.?Dc\i=NHr]3\u2z]​Sʨ>0aGw5X O>-W VF_~fcoF~yEl MaaRUcwWvcdk̻c^ЫNOaڸx4ΠsdAvMqpUo𔏵~%F[/b+aDbElb!"Kddgu$z& ,QElƀU!L #f3ohֆE#eu*X^s_bZqۂ *C*< BsL0'i)I] y=I'N;>t2fxR2#)BVCŽS%©TLJ3[G˫{s-m *#;Sx'ô *f7m|si*A G\|sI8iNr+@ڮ|qd 4';͋ gHϲ;o;Nwj!@Ff%Yf-IVCAN%dk(`"A#!x!VcY)JO_vs@Khc#q;rWRQ}@Tav=,iJ AtMdr h#9V`Ptܩ&cPL)*x&`:=\!NdjV'۔w#8{LؚS6žD l4%M^ e{m 3>4ھG!児UoN_2]؊JŚ6걏;Rc;F N157PzODĐmTɛD[2yv;쫹oDA}od#s A˳ Ozۘ+xѻIaٳ{Ь33>']^2:s06Xs:=F qt0?칟dZRH'^D`M gLN'ͻ\JHl3 tz*>q|hS` ˙R*uEE 8r*eJ{BB: ydT/U&L4r)qQT,@oPƬ\++%/nXyRZ(C%%YD86WgK~TG볦w2Chht?F l:4&F$ap hEd(#RNUxʃE[q'qW_7fcѯӛp ea^̞] Mɷ0&D1#|t`FM&!8d;Rkݯ-,d$2 q (Y*jOmo 0I7|z1|aQ }WęQ.=u`x㹐IeYAIжF;q3YS33Kㅬ-j]DT^a^aݘ+f{WR Rت?/~qb/?œ/^J]3DFhp!٩9,s7wPqᥕo~۸. 4¹(,kqPS3޹~5ֳcn0KOBp4޵b-ҸD3qSDs_\nQ8U5yפ; &rӐԟGRlߧ qEy>DCj .֑ Q ]NhHä́HNCrxcikW yEe a U=-h<-&ns9E6W zn)1 P |k(%IF3RfB{Z:_< ˠn߶!#q6ddBFSlCWmVSt;_9tI0CH Q!Վ i $T4z˲?8@W?C<Ҁ|BCPVtJYa @&^S*hb^1WPAkbaCmr[m}u5''Zj[hXǶYS S;NBI4qh n~lAB% `vbBn$Є:GϨ"bt+f Vv^_Z9\\Rb搤!F@%?ݬjAȜ *4 ~~4q/_83q:_BV zoBc&ZQx+I /tiz;<˸$V^ZlWKIIj^gc'{.Vxdf̈́m6ΉNVl1wP9Q@SvlQz(Ѱ+ugzX|mrE%i$79ƫ"/h$n1>}>̂Bw K}#3L^M+Mܵ i4"8>ii"Z D؋ç A 9+Cj4+]y'!j1JNٙJ[AG߾w1[Dna 2!3ܽi`H"?_&I^M7)޳ (-yw,j uh G\ĸN3RF])罍nU[]4˧$|Ĺ31" ur20@L5mt?9!oE@-:/ixz n@L\>9DtfiW:k"6 +{ xRŚМ !B  EU|_˫W0q@V0QV|-h+ QJMe-!ޡ9kuJSh|#~t8p|6r;b>g;Zr /8drp{# ϬD]0W!cm`%V~S Fc;+*B^Z%V:Uu S^a6%Z-ݛ%lsvIT I f&͒.hz|2W,iP]xI>P}QrSO#~jUCIR((yXyao8~'Q2{;ߢq8= լ ?;- ]ѻGOϧo?'-OX ,<@i'%]]Ht#:$r &Ar { i읷Ibǰ>L8y5„p{z*jy_.|֖ym*7ClO=Ǧ;R#c }ȥLo19 d߅p?4?uz,s!)Uuco1Oݍg/xznh#ЅdK6Zp+܅VEqڒ?GJFe{W,IE9v^1#ƺ)٢*Go=T(sDZ-&uE!UpB~3&ˣ7m*ude(aFx0Cv!Ɏ ;ǩ$>ih_}ej(8yϣcgeac'PYGPS=qB`^y@}wmhp?Z]8 K; _}vϢZ_ H}؄jTHi2^v9Jt7 c^j%i!-]YTE!ZE! z\0 R=Z 3 oϤ [KIFY&:/3cH(բ;%hzI9eHGy8hBh՚yqdM2o[]PH,853!<Tas?1 T(ID+GͻJkƉ:MPIh#KCB}13rslOgm'Fz ":gê>-4ҡDHMЛk1cVD7&>,Ҏl"@rF[a㌲ kX%-+e+Ýp:tഴBRnQUU%UdVCl&--"J!͹G<҈|C{4=nLPEiEn'bxkn Fzs.nh\8RV:&!5 [o˒Ҡ|l3NKqhF]rR x:WE3R?lz&=# (lp w(@Y8OQt.1bC\IS^ITvF`R+/X-IZpo,(8JAU3䢄n\22MGX˯eՆ6{$fˮwEaN 2-)#Bp/Fy gʚI_aŖ}L:aɻiU]3Ѐ$Xh>$[+iw9NkxrDXXn8]v27ՏΙ7k0DԓL$y0f0ÖXCJl !.1OV A{eh }(yTj:ٝn<{.hSF7!CN6xWGCKdu(@Qωk8i<* BI/R_rI. WT9AU4K)$)K'K:˵q9 P~Ee^+ XWNƺ2MX1z,|!$LJo2yMƔ(D y0Cj,5׻A '(FBB+Q\A8w.5dHC綺$8#;#py=G06P_WmhD:o#>{ 6޴Vi`ؒAIuU+MoW,Ux77`l?:onX-? 
var/home/core/zuul-output/logs/kubelet.log0000644000000000000000004671700215155101503017700 0ustar rootroot
Mar 13 20:27:47 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 13 20:27:47 crc restorecon[4688]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963 Mar 13 20:27:47 crc restorecon[4688]: 
/var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 13 20:27:47 crc restorecon[4688]: 
/var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 
20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 13 20:27:47 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 
Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c968,c969 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 
crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 
crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 
20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 
20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 
20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc 
restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 13 20:27:48 crc restorecon[4688]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
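Note (not part of the captured log): the restorecon[4688] entries above report paths whose SELinux label was left at its admin-customized value rather than being reset. Each label has the form user:role:type:level; the trailing MCS categories (for example c7,c13 versus c682,c947) are what keep one workload's container_file_t files from being readable by another. The Go snippet below is a minimal illustration of how such a label string splits into its fields; the type and function names are invented for this note and are not part of restorecon or the kubelet.

package main

import (
	"fmt"
	"strings"
)

// label holds the four colon-separated fields of an SELinux context such as
// "system_u:object_r:container_file_t:s0:c7,c13". Illustrative only.
type label struct {
	User, Role, Type, Level string
}

func parseLabel(s string) (label, error) {
	// The level itself may contain ':' (sensitivity plus MCS categories),
	// so split into at most four pieces.
	parts := strings.SplitN(s, ":", 4)
	if len(parts) != 4 {
		return label{}, fmt.Errorf("not an SELinux context: %q", s)
	}
	return label{User: parts[0], Role: parts[1], Type: parts[2], Level: parts[3]}, nil
}

func main() {
	l, err := parseLabel("system_u:object_r:container_file_t:s0:c7,c13")
	if err != nil {
		panic(err)
	}
	fmt.Printf("user=%s role=%s type=%s level=%s\n", l.User, l.Role, l.Type, l.Level)
	// user=system_u role=object_r type=container_file_t level=s0:c7,c13
}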
Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 20:27:49 crc kubenswrapper[4790]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.416921 4790 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422357 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422425 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422437 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422448 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422462 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422475 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422486 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422497 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422507 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422517 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422526 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422537 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422547 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422557 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422567 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422579 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422614 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422622 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422630 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422641 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. 
It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422653 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422666 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422701 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422715 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422727 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422747 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422762 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422773 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422783 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422793 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422803 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422816 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422826 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422836 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422845 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422855 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422865 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422875 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422883 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422891 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422899 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422906 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422914 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422923 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422931 4790 feature_gate.go:330] unrecognized 
feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422938 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422946 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422954 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422961 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422969 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422976 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.422984 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423010 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423019 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423030 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423037 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423045 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423053 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423061 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423068 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423076 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423083 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423091 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423105 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423115 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423125 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423135 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423145 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423155 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423165 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.423175 4790 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424082 4790 flags.go:64] FLAG: --address="0.0.0.0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424116 4790 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424145 4790 flags.go:64] FLAG: --anonymous-auth="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424161 4790 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424176 4790 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424185 4790 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424197 4790 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424208 4790 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424218 4790 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424227 4790 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424240 4790 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424249 4790 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424258 4790 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424267 4790 flags.go:64] FLAG: --cgroup-root="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424277 4790 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424286 4790 flags.go:64] FLAG: --client-ca-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424295 4790 flags.go:64] FLAG: --cloud-config="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424316 4790 flags.go:64] FLAG: --cloud-provider="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424325 4790 flags.go:64] FLAG: --cluster-dns="[]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424347 4790 flags.go:64] FLAG: --cluster-domain="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424355 4790 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424365 4790 flags.go:64] FLAG: --config-dir="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424373 4790 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424419 4790 flags.go:64] FLAG: --container-log-max-files="5" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424431 4790 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424440 4790 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424450 4790 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424459 4790 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424470 4790 flags.go:64] FLAG: --contention-profiling="false" Mar 13 20:27:49 crc 
kubenswrapper[4790]: I0313 20:27:49.424478 4790 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424488 4790 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424497 4790 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424506 4790 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424517 4790 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424526 4790 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424535 4790 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424544 4790 flags.go:64] FLAG: --enable-load-reader="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424553 4790 flags.go:64] FLAG: --enable-server="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424562 4790 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424583 4790 flags.go:64] FLAG: --event-burst="100" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424593 4790 flags.go:64] FLAG: --event-qps="50" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424603 4790 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424616 4790 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424627 4790 flags.go:64] FLAG: --eviction-hard="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424641 4790 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424652 4790 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424663 4790 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424675 4790 flags.go:64] FLAG: --eviction-soft="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424686 4790 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424696 4790 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424706 4790 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424717 4790 flags.go:64] FLAG: --experimental-mounter-path="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424728 4790 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424754 4790 flags.go:64] FLAG: --fail-swap-on="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424764 4790 flags.go:64] FLAG: --feature-gates="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424775 4790 flags.go:64] FLAG: --file-check-frequency="20s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424784 4790 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424794 4790 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424803 4790 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 13 20:27:49 crc 
kubenswrapper[4790]: I0313 20:27:49.424812 4790 flags.go:64] FLAG: --healthz-port="10248" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424821 4790 flags.go:64] FLAG: --help="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424831 4790 flags.go:64] FLAG: --hostname-override="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424839 4790 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424848 4790 flags.go:64] FLAG: --http-check-frequency="20s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424857 4790 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424866 4790 flags.go:64] FLAG: --image-credential-provider-config="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424875 4790 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424884 4790 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424892 4790 flags.go:64] FLAG: --image-service-endpoint="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424901 4790 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424910 4790 flags.go:64] FLAG: --kube-api-burst="100" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424920 4790 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424929 4790 flags.go:64] FLAG: --kube-api-qps="50" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424939 4790 flags.go:64] FLAG: --kube-reserved="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424948 4790 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424956 4790 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424966 4790 flags.go:64] FLAG: --kubelet-cgroups="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424975 4790 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424983 4790 flags.go:64] FLAG: --lock-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.424992 4790 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425001 4790 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425010 4790 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425027 4790 flags.go:64] FLAG: --log-json-split-stream="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425038 4790 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425049 4790 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425059 4790 flags.go:64] FLAG: --logging-format="text" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425071 4790 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425083 4790 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425093 4790 flags.go:64] FLAG: --manifest-url="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.425115 4790 flags.go:64] FLAG: --manifest-url-header="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425127 4790 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425137 4790 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425148 4790 flags.go:64] FLAG: --max-pods="110" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425157 4790 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425166 4790 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425177 4790 flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425188 4790 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425199 4790 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425212 4790 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425223 4790 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425250 4790 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425260 4790 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425273 4790 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425285 4790 flags.go:64] FLAG: --pod-cidr="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425296 4790 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425313 4790 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425323 4790 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425335 4790 flags.go:64] FLAG: --pods-per-core="0" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425346 4790 flags.go:64] FLAG: --port="10250" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425357 4790 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425368 4790 flags.go:64] FLAG: --provider-id="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425440 4790 flags.go:64] FLAG: --qos-reserved="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425454 4790 flags.go:64] FLAG: --read-only-port="10255" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425467 4790 flags.go:64] FLAG: --register-node="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425477 4790 flags.go:64] FLAG: --register-schedulable="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425487 4790 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425505 4790 flags.go:64] FLAG: --registry-burst="10" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425516 4790 flags.go:64] FLAG: --registry-qps="5" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425527 
4790 flags.go:64] FLAG: --reserved-cpus="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425538 4790 flags.go:64] FLAG: --reserved-memory="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425552 4790 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425563 4790 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425575 4790 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425586 4790 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425598 4790 flags.go:64] FLAG: --runonce="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425627 4790 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425640 4790 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425652 4790 flags.go:64] FLAG: --seccomp-default="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425663 4790 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425675 4790 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425688 4790 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425700 4790 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425712 4790 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425723 4790 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425734 4790 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425745 4790 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425756 4790 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425768 4790 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425780 4790 flags.go:64] FLAG: --system-cgroups="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425793 4790 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425812 4790 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425823 4790 flags.go:64] FLAG: --tls-cert-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425834 4790 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425856 4790 flags.go:64] FLAG: --tls-min-version="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425868 4790 flags.go:64] FLAG: --tls-private-key-file="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425879 4790 flags.go:64] FLAG: --topology-manager-policy="none" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425890 4790 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425901 4790 flags.go:64] FLAG: --topology-manager-scope="container" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425913 
4790 flags.go:64] FLAG: --v="2" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425928 4790 flags.go:64] FLAG: --version="false" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425942 4790 flags.go:64] FLAG: --vmodule="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425954 4790 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.425966 4790 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426265 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426282 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426295 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426305 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426316 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426329 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426341 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426353 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426420 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426434 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426445 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426455 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426463 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426472 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426480 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426488 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426496 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426503 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426511 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426519 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426527 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426535 4790 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426543 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426551 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426558 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426566 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426574 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426582 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426590 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426599 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426608 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426618 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426636 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426652 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426662 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426721 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426731 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426741 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426751 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426761 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426771 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426781 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426791 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426801 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426815 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426825 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426836 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 
20:27:49.426846 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426856 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426873 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426892 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426904 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426918 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426930 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426942 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426953 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426963 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426973 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426983 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.426993 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427003 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427014 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427024 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427033 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427045 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427054 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427064 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
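Note (not part of the captured log): the flags.go:64 dump above records the command line this kubelet started with, including values that the earlier deprecation warnings say should move into the file named by --config (/etc/kubernetes/kubelet.conf on this node), such as --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" and --register-with-taints="node-role.kubernetes.io/master=:NoSchedule". The sketch below shows, purely as an illustration, how those two value formats decompose (comma-separated key=value resources, and key=value:Effect taints); it is not the kubelet's own flag-parsing code, and the helper names are made up for this note.

package main

import (
	"fmt"
	"strings"
)

// parseResourceList turns "cpu=200m,memory=350Mi" into a map.
// Illustrative only; the real kubelet has its own flag types for this.
func parseResourceList(s string) (map[string]string, error) {
	out := map[string]string{}
	for _, pair := range strings.Split(s, ",") {
		kv := strings.SplitN(pair, "=", 2)
		if len(kv) != 2 || kv[0] == "" {
			return nil, fmt.Errorf("malformed entry %q", pair)
		}
		out[kv[0]] = kv[1]
	}
	return out, nil
}

// taint mirrors the key=value:Effect syntax of --register-with-taints.
// An empty value (as in "node-role.kubernetes.io/master=:NoSchedule") is allowed.
type taint struct {
	Key, Value, Effect string
}

func parseTaint(s string) (taint, error) {
	kvEffect := strings.SplitN(s, ":", 2)
	if len(kvEffect) != 2 {
		return taint{}, fmt.Errorf("missing effect in %q", s)
	}
	kv := strings.SplitN(kvEffect[0], "=", 2)
	t := taint{Key: kv[0], Effect: kvEffect[1]}
	if len(kv) == 2 {
		t.Value = kv[1]
	}
	return t, nil
}

func main() {
	reserved, _ := parseResourceList("cpu=200m,ephemeral-storage=350Mi,memory=350Mi")
	fmt.Println(reserved) // map[cpu:200m ephemeral-storage:350Mi memory:350Mi]

	t, _ := parseTaint("node-role.kubernetes.io/master=:NoSchedule")
	fmt.Printf("%+v\n", t) // {Key:node-role.kubernetes.io/master Value: Effect:NoSchedule}
}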
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427082 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427091 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427099 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.427107 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.427120 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.439243 4790 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.439282 4790 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439365 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439392 4790 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439399 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439404 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439408 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439412 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439417 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439423 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
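Note (not part of the captured log): the repeated feature_gate.go:330 warnings in this startup come from gate names in the configured feature-gate list that this kubelet build does not register (cluster-level gates used elsewhere in OpenShift, such as GatewayAPI, NewOLM, or PinnedImages); they are warnings only, while recognized names end up in the effective map logged at feature_gate.go:386 just above (for example ValidatingAdmissionPolicy:true and KMSv1:true). Since --feature-gates is empty in the flag dump, the list presumably reaches the kubelet through its config file rather than the flag. The snippet below is a loose, hypothetical sketch of that behavior for the flag-style "Name=bool,..." form, not the real k8s.io/component-base feature-gate implementation: unknown names warn, known names override their defaults.

package main

import (
	"fmt"
	"strconv"
	"strings"
)

// applyGates is an illustrative stand-in for feature-gate handling: names that
// were never registered only produce a warning; registered names have their
// default overridden. It is not the component-base code.
func applyGates(spec string, registered map[string]bool) map[string]bool {
	effective := make(map[string]bool, len(registered))
	for name, def := range registered {
		effective[name] = def
	}
	for _, pair := range strings.Split(spec, ",") {
		if pair == "" {
			continue
		}
		kv := strings.SplitN(pair, "=", 2)
		name := strings.TrimSpace(kv[0])
		if _, ok := registered[name]; !ok {
			fmt.Println("W unrecognized feature gate:", name)
			continue
		}
		val := true
		if len(kv) == 2 {
			val, _ = strconv.ParseBool(strings.TrimSpace(kv[1]))
		}
		effective[name] = val
	}
	return effective
}

func main() {
	// Registered gates and defaults are invented for the example; the effective
	// map in the log shows the real kubelet knows many more.
	registered := map[string]bool{"KMSv1": false, "ValidatingAdmissionPolicy": false, "NodeSwap": false}
	fmt.Println(applyGates("KMSv1=true,ValidatingAdmissionPolicy=true,GatewayAPI=true", registered))
	// W unrecognized feature gate: GatewayAPI
	// map[KMSv1:true NodeSwap:false ValidatingAdmissionPolicy:true]
}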
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439429 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439435 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439440 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439444 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439449 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439454 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439460 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439465 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439470 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439475 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439479 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439483 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439487 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439492 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439497 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439501 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439506 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439509 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439515 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439524 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439529 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439533 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439537 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439544 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439549 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439554 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439559 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439565 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439570 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439574 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439579 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439584 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439589 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439594 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439598 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439603 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439607 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439613 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439619 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439624 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439628 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439632 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439637 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439641 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439646 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439650 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439654 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439658 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439662 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439666 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439671 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439674 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439678 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439682 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439686 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439691 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439697 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439701 4790 feature_gate.go:330] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439706 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439710 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439714 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439718 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439723 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.439731 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439925 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439935 4790 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439941 4790 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439947 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439952 4790 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439959 4790 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439964 4790 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439969 4790 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439973 4790 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439978 4790 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439982 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439988 4790 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
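Note (not part of the captured log): the effective feature-gate map at feature_gate.go:386 appears twice during this startup (at 20:27:49.427120 and again at 20:27:49.439731 just above), apparently once per pass over the gate configuration, and the two renderings are identical. When comparing CI runs it can be convenient to pull that map back out of the log; the helper below is a small parser of the "{map[Name:bool ...]}" rendering written only for this note and not provided by Kubernetes.

package main

import (
	"fmt"
	"strings"
)

// parseGateDump converts the "feature gates: {map[A:true B:false]}" rendering
// that appears in kubelet logs back into a map. Log-analysis helper only.
func parseGateDump(line string) map[string]bool {
	start := strings.Index(line, "{map[")
	end := strings.LastIndex(line, "]}")
	if start < 0 || end <= start {
		return nil
	}
	gates := map[string]bool{}
	for _, field := range strings.Fields(line[start+len("{map[") : end]) {
		kv := strings.SplitN(field, ":", 2)
		if len(kv) == 2 {
			gates[kv[0]] = kv[1] == "true"
		}
	}
	return gates
}

func main() {
	line := `feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]}`
	fmt.Println(parseGateDump(line))
	// map[CloudDualStackNodeIPs:true KMSv1:true NodeSwap:false]
}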
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439994 4790 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.439999 4790 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440004 4790 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440009 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440015 4790 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440021 4790 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440026 4790 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440033 4790 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440038 4790 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440043 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440048 4790 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440053 4790 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440058 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440063 4790 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440067 4790 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440072 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440076 4790 feature_gate.go:330] unrecognized feature gate: Example Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440080 4790 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440084 4790 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440088 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440091 4790 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440095 4790 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440099 4790 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440103 4790 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440107 4790 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 20:27:49 crc 
kubenswrapper[4790]: W0313 20:27:49.440112 4790 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440115 4790 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440119 4790 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440125 4790 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440129 4790 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440132 4790 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440137 4790 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440140 4790 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440144 4790 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440148 4790 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440151 4790 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440155 4790 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440159 4790 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440163 4790 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440167 4790 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440171 4790 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440176 4790 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440180 4790 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440184 4790 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440188 4790 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440192 4790 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440196 4790 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440200 4790 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440203 4790 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440208 4790 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440212 4790 feature_gate.go:330] unrecognized feature gate: 
IngressControllerDynamicConfigurationManager
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440216 4790 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440220 4790 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440224 4790 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440228 4790 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440232 4790 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440236 4790 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440240 4790 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.440244 4790 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.440250 4790 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.440488 4790 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.443741 4790 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.448758 4790 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.448933 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.451513 4790 server.go:997] "Starting client certificate rotation"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.451575 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.451990 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.474994 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.477567 4790 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.478784 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.493755 4790 log.go:25] "Validated CRI v1 runtime API"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.527496 4790 log.go:25] "Validated CRI v1 image API"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.531771 4790 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.537791 4790 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-13-20-23-01-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.537831 4790 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.561348 4790 manager.go:217] Machine: {Timestamp:2026-03-13 20:27:49.558223126 +0000 UTC m=+0.579339057 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654116352 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:e656ddb5-8fa2-4c70-bd3f-f718d29b7550 BootID:ddb77a45-6df3-4ccf-8361-682222076454 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365408768 Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108168 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827056128 Type:vfs Inodes:4108168
HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827060224 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9f:1d:06 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9f:1d:06 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:4d:d1:84 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:4c:81:52 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:20:f9:ca Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:f0:0d:ac Speed:-1 Mtu:1496} {Name:eth10 MacAddress:52:65:0e:75:1e:5a Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:39:71:d8:37:c7 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654116352 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} 
{Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.561667 4790 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.561823 4790 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.566489 4790 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.566781 4790 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.566823 4790 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567139 4790 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567159 4790 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.567788 4790 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: 
I0313 20:27:49.567836 4790 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.568047 4790 state_mem.go:36] "Initialized new in-memory state store" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.568625 4790 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577409 4790 kubelet.go:418] "Attempting to sync node with API server" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577443 4790 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577548 4790 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577575 4790 kubelet.go:324] "Adding apiserver pod source" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.577596 4790 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.583633 4790 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.584658 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.584655 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.584785 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.584787 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.585825 4790 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.587999 4790 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589721 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589749 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589759 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589768 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589781 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589790 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589802 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589819 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589829 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589838 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589880 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.589890 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.595519 4790 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.596051 4790 server.go:1280] "Started kubelet"
Mar 13 20:27:49 crc systemd[1]: Started Kubernetes Kubelet.
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599433 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599794 4790 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599784 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599800 4790 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.600042 4790 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.600086 4790 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.599952 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.600128 4790 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.599820 4790 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.600318 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms"
Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.607314 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused
Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.607470 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.607967 4790 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608623 4790 server.go:460] "Adding debug handlers to kubelet server"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608778 4790 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608829 4790 factory.go:55] Registering systemd factory
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.608846 4790 factory.go:221] Registration of the systemd container factory successfully
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609399 4790 factory.go:153] Registering CRI-O factory
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609433 4790 factory.go:221] Registration of the crio container factory successfully
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609466 4790 factory.go:103] Registering Raw factory
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.609487 4790 manager.go:1196] Started watching for new ooms in manager
Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.608419 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.610984 4790 manager.go:319] Starting recovery of all containers
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623126 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623232 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623267 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623333 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623352 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623371 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623416 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext=""
Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623435 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623458 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623530 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623553 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623572 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623589 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623685 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623735 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623766 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623790 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623813 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.623831 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623852 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623870 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623888 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623907 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623925 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.623980 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624003 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624023 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624041 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624058 4790 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624110 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624137 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624155 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624172 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624189 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.624207 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628332 4790 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628435 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628461 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628479 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628494 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628509 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628522 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628539 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628554 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628571 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628585 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628600 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628616 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628631 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628647 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628661 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628674 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628697 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628711 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628726 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628742 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628757 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628772 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628786 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628800 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628814 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628828 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628846 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628860 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628876 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628889 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628905 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628923 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628939 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628952 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628964 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628976 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.628988 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629003 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629015 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629026 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629042 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629056 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629069 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629085 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629098 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629111 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629124 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629139 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629151 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629163 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629201 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629212 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629223 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629234 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629260 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629272 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629284 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629297 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629314 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629326 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629340 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629354 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629367 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629400 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629416 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629428 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629487 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629501 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629522 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629537 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629551 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629570 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629583 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629597 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629612 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629626 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629641 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629653 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629665 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629678 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629690 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629704 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629716 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629730 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629749 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629762 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629774 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629786 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629797 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629808 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629822 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" 
seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629834 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629847 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629857 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629869 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629882 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629895 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629913 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629924 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629937 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629949 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 
20:27:49.629974 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.629986 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630002 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630014 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630027 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630039 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630051 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630064 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630078 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630091 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630103 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630115 4790 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630127 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630142 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630155 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630170 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630185 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630201 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630214 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630226 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630240 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630250 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630262 4790 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630274 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630283 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630293 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630304 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630315 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630325 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630335 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630359 4790 manager.go:324] Recovery completed Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630399 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630549 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630599 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630609 4790 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630628 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630637 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630648 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630658 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630668 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630679 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630691 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630702 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630713 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630724 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630737 4790 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630752 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630764 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630776 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630790 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630802 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630815 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630828 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630840 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630852 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630865 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630876 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630887 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630898 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630909 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630920 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630931 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630941 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630953 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630963 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630973 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630983 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.630994 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631004 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631017 4790 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631026 4790 reconstruct.go:97] "Volume reconstruction finished" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.631033 4790 reconciler.go:26] "Reconciler: start to sync state" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.638838 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.640741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.640786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.640798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.644347 4790 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.644367 4790 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.644409 4790 state_mem.go:36] "Initialized new in-memory state store" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.656535 4790 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.658487 4790 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.658572 4790 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.658612 4790 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.658668 4790 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 20:27:49 crc kubenswrapper[4790]: W0313 20:27:49.659449 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.659538 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.662175 4790 policy_none.go:49] "None policy: Start" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.663092 4790 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.663128 4790 state_mem.go:35] "Initializing new in-memory state store" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.700656 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.708512 4790 manager.go:334] "Starting Device Plugin manager" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.708737 4790 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.708761 4790 server.go:79] "Starting device plugin registration server" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709188 4790 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709210 4790 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709385 4790 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709564 4790 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.709574 4790 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.715848 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.758790 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 20:27:49 crc kubenswrapper[4790]: 
I0313 20:27:49.758954 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.761721 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762823 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762860 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.762928 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763202 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763282 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.763472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764305 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.764972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765010 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765123 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765242 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765273 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765734 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765828 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765899 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.765908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766023 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766085 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766326 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766401 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766592 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766626 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766970 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.766992 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767126 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.767162 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.801755 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.809945 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811299 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811333 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811345 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.811391 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:49 crc kubenswrapper[4790]: E0313 20:27:49.811844 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832827 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832853 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.832942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833105 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833174 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833199 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833240 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.833328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935302 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935624 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 
20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935680 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935740 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.935960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.936020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.936021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:49 crc kubenswrapper[4790]: I0313 20:27:49.936120 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.012475 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014389 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.014426 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.015260 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.110030 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.130822 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.139275 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.157343 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4 WatchSource:0}: Error finding container 6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4: Status 404 returned error can't find the container with id 6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4 Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.158550 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.163558 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa WatchSource:0}: Error finding container 010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa: Status 404 returned error can't find the container with id 010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.164146 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174 WatchSource:0}: Error finding container b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174: Status 404 returned error can't find the container with id b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174 Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.164828 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.172205 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152 WatchSource:0}: Error finding container 988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152: Status 404 returned error can't find the container with id 988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152 Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.202483 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.426437 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.428770 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.429419 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.598400 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.598514 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.601094 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.663870 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"7eb1cafcefd5398e40c79482db9ff3626d16ce0f27e093e72f6093252fb76e4e"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.664987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"988c06ac10c8e5bcf9204100e0690808988ea062d7a1a5a82579e09239738152"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.666046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"010bd628c531d781ceb414f350a323538a15cd43329e078f60885e7765743afa"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.666954 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b99becb8c2df3ea3f462111e3cc78ebe053d9b079152a40d328de29590dec174"} Mar 13 20:27:50 crc kubenswrapper[4790]: I0313 20:27:50.667899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6c59e8ba7e5c77c197ae0dff4d51e944e20594cf06c492e42268f17ead17a4b4"} Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.704223 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.704328 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.897272 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.897446 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:50 crc kubenswrapper[4790]: W0313 20:27:50.919904 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:50 crc kubenswrapper[4790]: E0313 20:27:50.920022 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:51 crc kubenswrapper[4790]: E0313 20:27:51.003328 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.229807 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232426 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.232480 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:51 crc kubenswrapper[4790]: E0313 20:27:51.233466 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.601670 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.673092 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.674076 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.673858 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.673674 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b"} Mar 13 20:27:51 crc kubenswrapper[4790]: E0313 20:27:51.675948 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.677874 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.677947 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.677985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.679784 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.679884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.680018 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.680443 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.681677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.681708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.681718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.682679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.682722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.682740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.684475 4790 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.684522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.684612 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.686374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.686631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.686856 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.689734 4790 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5" exitCode=0 Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.689795 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.689901 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.690968 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.691018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.691037 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698006 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698099 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698144 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe"} Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.698353 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.703337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.703419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:51 crc kubenswrapper[4790]: I0313 20:27:51.703441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.601959 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.604578 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.699360 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708203 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.708291 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.712599 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d" exitCode=0 Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.712679 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.712869 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.714042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.714085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.714098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.716528 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.716522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.717312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.717348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.717360 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720525 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938"} Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720545 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.720573 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.721841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.815641 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.833807 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835410 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:52 crc kubenswrapper[4790]: I0313 20:27:52.835534 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.836258 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.143:6443: connect: connection refused" node="crc" Mar 13 20:27:52 crc kubenswrapper[4790]: W0313 20:27:52.891915 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.892021 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:52 crc kubenswrapper[4790]: W0313 20:27:52.907238 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.143:6443: connect: connection refused Mar 13 20:27:52 crc kubenswrapper[4790]: E0313 20:27:52.907341 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.143:6443: connect: connection refused" logger="UnhandledError" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.363313 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.660583 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.726711 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0"} Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.726830 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.732679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.732779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.732796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.734829 4790 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364" exitCode=0 Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.734935 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.734977 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.735002 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.735014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364"} Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.735064 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736126 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736312 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736345 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736529 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736612 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736677 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:53 crc kubenswrapper[4790]: I0313 20:27:53.736835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.213209 4790 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.313500 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da"} Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742298 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742236 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.742277 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743117 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743610 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.743625 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.745226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.745258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 
20:27:54 crc kubenswrapper[4790]: I0313 20:27:54.745268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749193 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a"} Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749307 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749329 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.749979 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750432 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750653 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.750682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.751244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.751260 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.751266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.875644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 13 20:27:55 crc kubenswrapper[4790]: I0313 20:27:55.963118 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.037091 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.038593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:56 crc kubenswrapper[4790]: 
I0313 20:27:56.038625 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.363594 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.363770 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.756925 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.758071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.758130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:56 crc kubenswrapper[4790]: I0313 20:27:56.758142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.759574 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.760472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.760506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:57 crc kubenswrapper[4790]: I0313 20:27:57.760517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.418960 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.419136 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.420542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.420774 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:27:58 crc kubenswrapper[4790]: I0313 20:27:58.420899 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:27:59 crc kubenswrapper[4790]: E0313 20:27:59.716090 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.230949 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:00 crc 
kubenswrapper[4790]: I0313 20:28:00.231229 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.233113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.233193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.233232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.236442 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.768926 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.770600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.770661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.770686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:00 crc kubenswrapper[4790]: I0313 20:28:00.774064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.771447 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.772821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.772872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:01 crc kubenswrapper[4790]: I0313 20:28:01.772884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:02 crc kubenswrapper[4790]: I0313 20:28:02.802353 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 13 20:28:02 crc kubenswrapper[4790]: I0313 20:28:02.802497 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.601872 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.661567 4790 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.661686 4790 trace.go:236] Trace[542972085]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (13-Mar-2026 20:27:53.660) (total time: 10001ms): Mar 13 20:28:03 crc kubenswrapper[4790]: Trace[542972085]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (20:28:03.661) Mar 13 20:28:03 crc kubenswrapper[4790]: Trace[542972085]: [10.00115726s] [10.00115726s] END Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.661718 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.861706 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.873570 4790 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.877412 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.878691 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 
20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.878750 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.878832 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.878910 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.879503 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.883281 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.883396 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.884329 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:03 crc kubenswrapper[4790]: I0313 20:28:03.884427 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 13 20:28:03 crc kubenswrapper[4790]: W0313 20:28:03.885267 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z Mar 13 20:28:03 crc kubenswrapper[4790]: E0313 20:28:03.885352 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:03Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.219027 4790 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]log ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]etcd ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-controllers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/crd-informer-synced ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 13 20:28:04 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 13 20:28:04 crc kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok Mar 13 20:28:04 crc kubenswrapper[4790]: 
[+]poststarthook/apiservice-status-local-available-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]autoregister-completion ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 13 20:28:04 crc kubenswrapper[4790]: livez check failed Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.219120 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.605016 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:04Z is after 2026-02-23T05:33:13Z Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.781564 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.783287 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0" exitCode=255 Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.783338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0"} Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.783512 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.784425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.784472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.784482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:04 crc kubenswrapper[4790]: I0313 20:28:04.785086 4790 scope.go:117] "RemoveContainer" containerID="6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.403346 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.403730 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.405549 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.405595 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.405609 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.433975 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.605864 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:05Z is after 2026-02-23T05:33:13Z Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.790574 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.792758 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41"} Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.792908 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.792992 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.793750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.793778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.793791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.794254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.794282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.794294 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:05 crc kubenswrapper[4790]: I0313 20:28:05.810363 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.364977 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.365087 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.604426 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:06Z is after 2026-02-23T05:33:13Z Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.797688 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.798679 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801044 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" exitCode=255 Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41"} Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801245 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801317 4790 scope.go:117] "RemoveContainer" containerID="6a61f22a2153f3d473dcd3aee424a407db7b0fe6864d02f4c01c31829aad7ed0" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.801453 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.802084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.802113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.802128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 20:28:06.803161 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:06 crc kubenswrapper[4790]: I0313 
20:28:06.803750 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:06 crc kubenswrapper[4790]: E0313 20:28:06.803923 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.605732 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:07Z is after 2026-02-23T05:33:13Z Mar 13 20:28:07 crc kubenswrapper[4790]: W0313 20:28:07.620972 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:07Z is after 2026-02-23T05:33:13Z Mar 13 20:28:07 crc kubenswrapper[4790]: E0313 20:28:07.621259 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.778890 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.804661 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.806679 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.807635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.807671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.807681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:07 crc kubenswrapper[4790]: I0313 20:28:07.808191 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:07 crc kubenswrapper[4790]: E0313 20:28:07.808427 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:08 crc kubenswrapper[4790]: W0313 20:28:08.152848 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:08Z is after 2026-02-23T05:33:13Z Mar 13 20:28:08 crc kubenswrapper[4790]: E0313 20:28:08.152943 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 13 20:28:08 crc kubenswrapper[4790]: I0313 20:28:08.603934 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:08Z is after 2026-02-23T05:33:13Z Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.225650 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.225857 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.227441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.227539 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.227566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.228808 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:09 crc kubenswrapper[4790]: E0313 20:28:09.229093 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.241910 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.603037 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:09Z 
is after 2026-02-23T05:33:13Z Mar 13 20:28:09 crc kubenswrapper[4790]: E0313 20:28:09.716187 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.812280 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.813477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.813608 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.813754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:09 crc kubenswrapper[4790]: I0313 20:28:09.814673 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:09 crc kubenswrapper[4790]: E0313 20:28:09.815008 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.280093 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:10 crc kubenswrapper[4790]: E0313 20:28:10.281434 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:10Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.281976 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:10 crc kubenswrapper[4790]: E0313 20:28:10.284810 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:28:10Z is after 2026-02-23T05:33:13Z" node="crc" Mar 13 20:28:10 crc kubenswrapper[4790]: W0313 20:28:10.521534 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 20:28:10 crc kubenswrapper[4790]: E0313 20:28:10.521609 4790 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:10 crc kubenswrapper[4790]: I0313 20:28:10.605110 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:11 crc kubenswrapper[4790]: I0313 20:28:11.609076 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.605137 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.646176 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.664932 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.801163 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.801518 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.803537 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.803611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.803637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:12 crc kubenswrapper[4790]: I0313 20:28:12.804633 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:12 crc kubenswrapper[4790]: E0313 20:28:12.805000 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:13 crc kubenswrapper[4790]: I0313 20:28:13.611061 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.870583 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" 
in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c4fcc930 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,LastTimestamp:2026-03-13 20:27:49.596006704 +0000 UTC m=+0.617122595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.877614 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.884610 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.891833 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.899058 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189c8086cbde6881 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.711456385 +0000 UTC m=+0.732572276,LastTimestamp:2026-03-13 20:27:49.711456385 +0000 UTC m=+0.732572276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.907097 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.76148253 +0000 UTC m=+0.782598421,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.914206 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.761506491 +0000 UTC m=+0.782622382,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.921182 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.761518122 +0000 UTC m=+0.782634013,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.926189 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" 
is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.762744317 +0000 UTC m=+0.783860208,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.933691 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.762773659 +0000 UTC m=+0.783889550,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.939156 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.762783159 +0000 UTC m=+0.783899050,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.946690 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.763454249 +0000 UTC m=+0.784570130,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 
20:28:13.953909 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.76346871 +0000 UTC m=+0.784584601,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.960199 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.76347707 +0000 UTC m=+0.784592961,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.967049 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.764287418 +0000 UTC m=+0.785403309,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.972528 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.764301768 +0000 UTC m=+0.785417659,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.976649 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.764311739 +0000 UTC m=+0.785427630,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.981230 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.76499689 +0000 UTC m=+0.786112781,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.986497 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.765017041 +0000 UTC m=+0.786132932,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.991447 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC 
m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.765027531 +0000 UTC m=+0.786143432,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:13 crc kubenswrapper[4790]: E0313 20:28:13.998782 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.765724283 +0000 UTC m=+0.786840174,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.005728 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.765740313 +0000 UTC m=+0.786856204,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.010588 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a8560b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a8560b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640803851 +0000 UTC m=+0.661919742,LastTimestamp:2026-03-13 20:27:49.765751034 +0000 UTC m=+0.786866925,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.014674 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a7d19e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a7d19e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: 
NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.64076995 +0000 UTC m=+0.661885841,LastTimestamp:2026-03-13 20:27:49.76588529 +0000 UTC m=+0.787001181,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.019053 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189c8086c7a82a91\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189c8086c7a82a91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:49.640792721 +0000 UTC m=+0.661908612,LastTimestamp:2026-03-13 20:27:49.765904731 +0000 UTC m=+0.787020622,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.028025 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8086e6e80380 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.16507072 +0000 UTC m=+1.186186611,LastTimestamp:2026-03-13 20:27:50.16507072 +0000 UTC m=+1.186186611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.033884 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8086e7061516 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.167041302 +0000 UTC m=+1.188157193,LastTimestamp:2026-03-13 20:27:50.167041302 +0000 UTC m=+1.188157193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 
20:28:14.040983 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8086e70de4e8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.167553256 +0000 UTC m=+1.188669177,LastTimestamp:2026-03-13 20:27:50.167553256 +0000 UTC m=+1.188669177,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.045729 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c8086e7c760b0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.179709104 +0000 UTC m=+1.200824995,LastTimestamp:2026-03-13 20:27:50.179709104 +0000 UTC m=+1.200824995,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.051533 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c8086e82a08b7 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.186174647 +0000 UTC m=+1.207290538,LastTimestamp:2026-03-13 20:27:50.186174647 +0000 UTC m=+1.207290538,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 
20:28:14.057532 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80870b739e4f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.778199631 +0000 UTC m=+1.799315552,LastTimestamp:2026-03-13 20:27:50.778199631 +0000 UTC m=+1.799315552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.064667 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80870b85f55c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.779401564 +0000 UTC m=+1.800517455,LastTimestamp:2026-03-13 20:27:50.779401564 +0000 UTC m=+1.800517455,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.071570 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870b91ae38 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.780169784 +0000 UTC m=+1.801285715,LastTimestamp:2026-03-13 20:27:50.780169784 +0000 UTC m=+1.801285715,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.077792 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80870ba3c3f5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.781354997 +0000 UTC m=+1.802470888,LastTimestamp:2026-03-13 20:27:50.781354997 +0000 UTC m=+1.802470888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.081938 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c80870be1b5ce openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.785414606 +0000 UTC m=+1.806530507,LastTimestamp:2026-03-13 20:27:50.785414606 +0000 UTC m=+1.806530507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.087862 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870c6cbc73 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.794525811 +0000 UTC m=+1.815641702,LastTimestamp:2026-03-13 20:27:50.794525811 +0000 UTC m=+1.815641702,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.091782 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80870c83465f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796002911 +0000 UTC m=+1.817118802,LastTimestamp:2026-03-13 20:27:50.796002911 +0000 UTC m=+1.817118802,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.095432 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80870c836671 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796011121 +0000 UTC m=+1.817127012,LastTimestamp:2026-03-13 20:27:50.796011121 +0000 UTC m=+1.817127012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.099022 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870c8d5459 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796661849 +0000 UTC m=+1.817777730,LastTimestamp:2026-03-13 20:27:50.796661849 +0000 UTC m=+1.817777730,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.102218 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80870cc19171 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.800085361 +0000 UTC m=+1.821201252,LastTimestamp:2026-03-13 20:27:50.800085361 +0000 UTC m=+1.821201252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.105388 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c80870cf404c0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.80339168 +0000 UTC m=+1.824507571,LastTimestamp:2026-03-13 20:27:50.80339168 +0000 UTC m=+1.824507571,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.108874 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872150a61c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.14500662 +0000 UTC m=+2.166122511,LastTimestamp:2026-03-13 20:27:51.14500662 +0000 UTC m=+2.166122511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.112700 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8087220437d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.156774867 +0000 UTC m=+2.177890758,LastTimestamp:2026-03-13 20:27:51.156774867 +0000 UTC m=+2.177890758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.116277 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8087221a6cdc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.158230236 +0000 UTC m=+2.179346127,LastTimestamp:2026-03-13 20:27:51.158230236 +0000 UTC m=+2.179346127,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.119810 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872e61cead openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.364234925 +0000 UTC m=+2.385350816,LastTimestamp:2026-03-13 20:27:51.364234925 +0000 UTC m=+2.385350816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.123212 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872f1690f7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.376081143 +0000 UTC m=+2.397197034,LastTimestamp:2026-03-13 20:27:51.376081143 +0000 UTC m=+2.397197034,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.126947 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872f26d3c6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.377146822 +0000 UTC m=+2.398262713,LastTimestamp:2026-03-13 20:27:51.377146822 +0000 UTC m=+2.398262713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.131099 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808738fe00d2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.542243538 +0000 UTC m=+2.563359429,LastTimestamp:2026-03-13 20:27:51.542243538 +0000 UTC m=+2.563359429,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.134558 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808739a645a5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.553271205 +0000 UTC m=+2.574387096,LastTimestamp:2026-03-13 20:27:51.553271205 +0000 UTC m=+2.574387096,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.140933 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808741344446 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.680017478 +0000 UTC m=+2.701133369,LastTimestamp:2026-03-13 20:27:51.680017478 +0000 UTC m=+2.701133369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.145625 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80874166509d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.683297437 +0000 UTC m=+2.704413378,LastTimestamp:2026-03-13 20:27:51.683297437 +0000 UTC m=+2.704413378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.149522 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c808741d1b8dd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.690336477 +0000 UTC m=+2.711452428,LastTimestamp:2026-03-13 20:27:51.690336477 +0000 UTC m=+2.711452428,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.154085 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c8087424d48ed openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.698434285 +0000 UTC m=+2.719550206,LastTimestamp:2026-03-13 20:27:51.698434285 +0000 UTC m=+2.719550206,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.159702 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087509c98da openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.938513114 +0000 UTC m=+2.959629005,LastTimestamp:2026-03-13 20:27:51.938513114 +0000 UTC m=+2.959629005,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.165062 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c8087509d5055 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.938560085 +0000 UTC m=+2.959675976,LastTimestamp:2026-03-13 20:27:51.938560085 +0000 UTC m=+2.959675976,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.171916 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808750ad742b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.939617835 +0000 UTC 
m=+2.960733726,LastTimestamp:2026-03-13 20:27:51.939617835 +0000 UTC m=+2.960733726,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.178855 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c808750cf12f9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.941821177 +0000 UTC m=+2.962937068,LastTimestamp:2026-03-13 20:27:51.941821177 +0000 UTC m=+2.962937068,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.183519 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087517e45c4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.95330298 +0000 UTC m=+2.974418861,LastTimestamp:2026-03-13 20:27:51.95330298 +0000 UTC m=+2.974418861,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.188271 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189c8087519453b4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.95474834 +0000 UTC m=+2.975864231,LastTimestamp:2026-03-13 20:27:51.95474834 +0000 UTC m=+2.975864231,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.194527 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80875195e73b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.954851643 +0000 UTC m=+2.975967534,LastTimestamp:2026-03-13 20:27:51.954851643 +0000 UTC m=+2.975967534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.199244 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087519852d2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.955010258 +0000 UTC m=+2.976126149,LastTimestamp:2026-03-13 20:27:51.955010258 +0000 UTC m=+2.976126149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.204093 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c808751ac1192 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.956304274 +0000 UTC m=+2.977420165,LastTimestamp:2026-03-13 20:27:51.956304274 +0000 UTC m=+2.977420165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.211431 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c808751db23fd openshift-etcd 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.959389181 +0000 UTC m=+2.980505072,LastTimestamp:2026-03-13 20:27:51.959389181 +0000 UTC m=+2.980505072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.216210 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80875f9f0dc7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.190332359 +0000 UTC m=+3.211448250,LastTimestamp:2026-03-13 20:27:52.190332359 +0000 UTC m=+3.211448250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.221038 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80875fc6bb40 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.192932672 +0000 UTC m=+3.214048563,LastTimestamp:2026-03-13 20:27:52.192932672 +0000 UTC m=+3.214048563,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.225476 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80876095a90b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.206493963 +0000 UTC 
m=+3.227609854,LastTimestamp:2026-03-13 20:27:52.206493963 +0000 UTC m=+3.227609854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.230095 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c808760acd19d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.208011677 +0000 UTC m=+3.229127568,LastTimestamp:2026-03-13 20:27:52.208011677 +0000 UTC m=+3.229127568,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.234195 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808760cba7c8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.210032584 +0000 UTC m=+3.231148475,LastTimestamp:2026-03-13 20:27:52.210032584 +0000 UTC m=+3.231148475,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.239702 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808760e3481d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.211580957 +0000 UTC m=+3.232696848,LastTimestamp:2026-03-13 20:27:52.211580957 +0000 UTC m=+3.232696848,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.244848 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80876b87d9a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.390138281 +0000 UTC m=+3.411254172,LastTimestamp:2026-03-13 20:27:52.390138281 +0000 UTC m=+3.411254172,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.249067 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80876ba03eb7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.391737015 +0000 UTC m=+3.412852926,LastTimestamp:2026-03-13 20:27:52.391737015 +0000 UTC m=+3.412852926,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.253436 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80876c76e567 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.405804391 +0000 UTC m=+3.426920282,LastTimestamp:2026-03-13 20:27:52.405804391 +0000 UTC m=+3.426920282,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.257731 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c80876c90a7c5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.407492549 +0000 UTC m=+3.428608440,LastTimestamp:2026-03-13 20:27:52.407492549 +0000 UTC m=+3.428608440,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.263874 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189c80876ca6ab0c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.40893518 +0000 UTC m=+3.430051071,LastTimestamp:2026-03-13 20:27:52.40893518 +0000 UTC m=+3.430051071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.271642 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808776ce152c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.579290412 +0000 UTC m=+3.600406303,LastTimestamp:2026-03-13 20:27:52.579290412 +0000 UTC m=+3.600406303,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.278046 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087779dd855 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.592906325 +0000 UTC m=+3.614022216,LastTimestamp:2026-03-13 20:27:52.592906325 +0000 UTC m=+3.614022216,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.284952 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808777afd2bb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.594084539 +0000 UTC m=+3.615200420,LastTimestamp:2026-03-13 20:27:52.594084539 +0000 UTC m=+3.615200420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.291668 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80877eeafb6c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.715402092 +0000 UTC m=+3.736517983,LastTimestamp:2026-03-13 20:27:52.715402092 +0000 UTC m=+3.736517983,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.299055 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808783683d07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.790719751 +0000 UTC m=+3.811835642,LastTimestamp:2026-03-13 20:27:52.790719751 +0000 UTC m=+3.811835642,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.302962 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8087843b66ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.804558591 +0000 UTC m=+3.825674482,LastTimestamp:2026-03-13 20:27:52.804558591 +0000 UTC m=+3.825674482,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.307910 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80878b2e9bd5 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.921160661 +0000 UTC m=+3.942276552,LastTimestamp:2026-03-13 20:27:52.921160661 +0000 UTC m=+3.942276552,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.311789 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c80878c022ec3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:52.935026371 +0000 UTC m=+3.956142252,LastTimestamp:2026-03-13 20:27:52.935026371 +0000 UTC m=+3.956142252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.316238 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087bbd7f58e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:53.737565582 +0000 UTC m=+4.758681473,LastTimestamp:2026-03-13 20:27:53.737565582 +0000 UTC m=+4.758681473,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.319529 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087cdc9a530 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.038617392 +0000 UTC m=+5.059733283,LastTimestamp:2026-03-13 20:27:54.038617392 +0000 UTC m=+5.059733283,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.323432 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ce56a156 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.047856982 +0000 UTC m=+5.068972883,LastTimestamp:2026-03-13 20:27:54.047856982 +0000 UTC m=+5.068972883,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.329014 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ce70a248 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.04956116 +0000 UTC m=+5.070677081,LastTimestamp:2026-03-13 20:27:54.04956116 +0000 UTC m=+5.070677081,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.332819 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087d7f9ab7b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.209536891 +0000 UTC m=+5.230652782,LastTimestamp:2026-03-13 20:27:54.209536891 +0000 UTC m=+5.230652782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.336814 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087d895404c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.219733068 +0000 UTC m=+5.240848959,LastTimestamp:2026-03-13 20:27:54.219733068 +0000 UTC m=+5.240848959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.340831 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087d8a8ceb9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.221014713 +0000 UTC m=+5.242130604,LastTimestamp:2026-03-13 20:27:54.221014713 +0000 UTC 
m=+5.242130604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.345025 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087e273295f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.385271135 +0000 UTC m=+5.406387026,LastTimestamp:2026-03-13 20:27:54.385271135 +0000 UTC m=+5.406387026,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.349737 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087e37166ea openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.401933034 +0000 UTC m=+5.423048935,LastTimestamp:2026-03-13 20:27:54.401933034 +0000 UTC m=+5.423048935,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.354818 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087e387fdd0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.403413456 +0000 UTC m=+5.424529347,LastTimestamp:2026-03-13 20:27:54.403413456 +0000 UTC m=+5.424529347,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.359074 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ed5e8061 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.568466529 +0000 UTC m=+5.589582410,LastTimestamp:2026-03-13 20:27:54.568466529 +0000 UTC m=+5.589582410,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.362870 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087ede07770 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.57698392 +0000 UTC m=+5.598099811,LastTimestamp:2026-03-13 20:27:54.57698392 +0000 UTC m=+5.598099811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.367272 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087edff68a2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.579011746 +0000 UTC m=+5.600127637,LastTimestamp:2026-03-13 20:27:54.579011746 +0000 UTC m=+5.600127637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.372214 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087f9db8a97 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.777987735 +0000 UTC m=+5.799103626,LastTimestamp:2026-03-13 20:27:54.777987735 +0000 UTC m=+5.799103626,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.376697 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189c8087fa8c9aa9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:54.789591721 +0000 UTC m=+5.810707622,LastTimestamp:2026-03-13 20:27:54.789591721 +0000 UTC m=+5.810707622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.381737 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c8088585fde92 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:14 crc kubenswrapper[4790]: body: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:56.36371829 +0000 UTC m=+7.384834271,LastTimestamp:2026-03-13 20:27:56.36371829 +0000 UTC m=+7.384834271,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.387887 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808858623e0c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:56.363873804 +0000 UTC m=+7.384989735,LastTimestamp:2026-03-13 20:27:56.363873804 +0000 UTC m=+7.384989735,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.395074 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c8089d8276554 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 13 20:28:14 crc kubenswrapper[4790]: body: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:02.80246818 +0000 UTC m=+13.823584161,LastTimestamp:2026-03-13 20:28:02.80246818 +0000 UTC m=+13.823584161,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.400454 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c8089d828b8e8 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:02.802555112 +0000 UTC m=+13.823671043,LastTimestamp:2026-03-13 20:28:02.802555112 +0000 UTC m=+13.823671043,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.405137 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c808a184ddd5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 20:28:14 crc kubenswrapper[4790]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:14 crc kubenswrapper[4790]: 
Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878731101 +0000 UTC m=+14.899846992,LastTimestamp:2026-03-13 20:28:03.878731101 +0000 UTC m=+14.899846992,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.410007 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808a184e923b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878777403 +0000 UTC m=+14.899893294,LastTimestamp:2026-03-13 20:28:03.878777403 +0000 UTC m=+14.899893294,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.415641 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c808a184ddd5d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c808a184ddd5d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 13 20:28:14 crc kubenswrapper[4790]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 13 20:28:14 crc kubenswrapper[4790]: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878731101 +0000 UTC m=+14.899846992,LastTimestamp:2026-03-13 20:28:03.884397961 +0000 UTC m=+14.905513852,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.420421 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189c808a184e923b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189c808a184e923b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:03.878777403 +0000 UTC m=+14.899893294,LastTimestamp:2026-03-13 20:28:03.884458882 +0000 UTC m=+14.905574773,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.425179 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-apiserver-crc.189c808a2c976de3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 500 Mar 13 20:28:14 crc kubenswrapper[4790]: body: [+]ping ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]log ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]etcd ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-filter ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-informers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-apiextensions-controllers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/crd-informer-synced ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-system-namespaces-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 13 20:28:14 crc 
kubenswrapper[4790]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/bootstrap-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/start-kube-aggregator-informers ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-registration-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-discovery-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]autoregister-completion ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapi-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 13 20:28:14 crc kubenswrapper[4790]: livez check failed Mar 13 20:28:14 crc kubenswrapper[4790]: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:04.219096547 +0000 UTC m=+15.240212468,LastTimestamp:2026-03-13 20:28:04.219096547 +0000 UTC m=+15.240212468,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.431688 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:14 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c808aac800c25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:14 crc kubenswrapper[4790]: body: Mar 13 20:28:14 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,LastTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:14 crc kubenswrapper[4790]: > Mar 13 20:28:14 crc kubenswrapper[4790]: E0313 20:28:14.436847 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808aac8164d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,LastTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:14 crc kubenswrapper[4790]: I0313 20:28:14.605037 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:15 crc kubenswrapper[4790]: W0313 20:28:15.475962 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 13 20:28:15 crc kubenswrapper[4790]: E0313 20:28:15.476024 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:15 crc kubenswrapper[4790]: I0313 20:28:15.605784 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364002 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364140 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364215 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.364486 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366789 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.366982 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84" gracePeriod=30 Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.369501 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac800c25\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:16 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c808aac800c25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:16 crc kubenswrapper[4790]: body: Mar 13 20:28:16 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,LastTimestamp:2026-03-13 20:28:16.364110168 +0000 UTC m=+27.385226099,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:16 crc kubenswrapper[4790]: > Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.375670 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac8164d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808aac8164d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while 
awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,LastTimestamp:2026-03-13 20:28:16.36417783 +0000 UTC m=+27.385293761,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.382291 4790 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808d00a91ed2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:16.366960338 +0000 UTC m=+27.388076229,LastTimestamp:2026-03-13 20:28:16.366960338 +0000 UTC m=+27.388076229,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.496767 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c80870c8d5459\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80870c8d5459 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:50.796661849 +0000 UTC m=+1.817777730,LastTimestamp:2026-03-13 20:28:16.490287908 +0000 UTC m=+27.511403799,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.607701 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.713271 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c80872150a61c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c80872150a61c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.14500662 +0000 UTC m=+2.166122511,LastTimestamp:2026-03-13 20:28:16.706123061 +0000 UTC m=+27.727238952,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.728107 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c8087220437d3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c8087220437d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:27:51.156774867 +0000 UTC m=+2.177890758,LastTimestamp:2026-03-13 20:28:16.720622558 +0000 UTC m=+27.741738449,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:16 crc kubenswrapper[4790]: W0313 20:28:16.787799 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 20:28:16 crc kubenswrapper[4790]: E0313 20:28:16.787915 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.837077 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.837616 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84" exitCode=255 Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.837683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84"} Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.838014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34"} Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.838175 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.839156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.839211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:16 crc kubenswrapper[4790]: I0313 20:28:16.839227 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.285422 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.287853 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:17 crc kubenswrapper[4790]: E0313 20:28:17.289678 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:17 crc kubenswrapper[4790]: E0313 20:28:17.289818 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.606017 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.840908 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.842257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.842321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:17 crc kubenswrapper[4790]: I0313 20:28:17.842335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:18 crc kubenswrapper[4790]: I0313 20:28:18.608249 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:19 
crc kubenswrapper[4790]: I0313 20:28:19.607494 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:19 crc kubenswrapper[4790]: E0313 20:28:19.716532 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:20 crc kubenswrapper[4790]: W0313 20:28:20.221221 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:20 crc kubenswrapper[4790]: E0313 20:28:20.221595 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:20 crc kubenswrapper[4790]: I0313 20:28:20.606511 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:21 crc kubenswrapper[4790]: I0313 20:28:21.607044 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:22 crc kubenswrapper[4790]: I0313 20:28:22.607867 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.363919 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.364225 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.366091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.366178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.366208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:23 crc kubenswrapper[4790]: I0313 20:28:23.606918 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.290253 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.291990 4790 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.292071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.292099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.292192 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:24 crc kubenswrapper[4790]: E0313 20:28:24.296319 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:24 crc kubenswrapper[4790]: E0313 20:28:24.296935 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.313955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.314229 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.315645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.315708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.315726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:24 crc kubenswrapper[4790]: I0313 20:28:24.606539 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:25 crc kubenswrapper[4790]: I0313 20:28:25.605942 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.365005 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.365350 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:28:26 crc kubenswrapper[4790]: E0313 
20:28:26.372442 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac800c25\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 13 20:28:26 crc kubenswrapper[4790]: &Event{ObjectMeta:{kube-controller-manager-crc.189c808aac800c25 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 13 20:28:26 crc kubenswrapper[4790]: body: Mar 13 20:28:26 crc kubenswrapper[4790]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365047845 +0000 UTC m=+17.386163776,LastTimestamp:2026-03-13 20:28:26.36532605 +0000 UTC m=+37.386441981,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 13 20:28:26 crc kubenswrapper[4790]: > Mar 13 20:28:26 crc kubenswrapper[4790]: E0313 20:28:26.377063 4790 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189c808aac8164d8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189c808aac8164d8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:28:06.365136088 +0000 UTC m=+17.386252009,LastTimestamp:2026-03-13 20:28:26.365554777 +0000 UTC m=+37.386670708,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.606045 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.659852 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661110 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.661683 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:26 crc kubenswrapper[4790]: I0313 20:28:26.869016 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.605651 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.884622 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.886027 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.888769 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" exitCode=255 Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.888830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e"} Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.889053 4790 scope.go:117] "RemoveContainer" containerID="dc4bfb568e7128b6a2356d653026522f42280d96739cf2c56a554b9a9a28fe41" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.889238 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.891483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.891627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.891718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:27 crc kubenswrapper[4790]: I0313 20:28:27.892611 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:27 crc kubenswrapper[4790]: E0313 20:28:27.892941 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:28 crc kubenswrapper[4790]: I0313 20:28:28.605633 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:28 crc kubenswrapper[4790]: I0313 20:28:28.893593 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:29 crc kubenswrapper[4790]: I0313 20:28:29.606979 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:29 crc kubenswrapper[4790]: E0313 20:28:29.716680 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:30 crc kubenswrapper[4790]: I0313 20:28:30.606296 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.297320 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.299334 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:31 crc kubenswrapper[4790]: E0313 20:28:31.304992 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:31 crc kubenswrapper[4790]: E0313 20:28:31.305041 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:31 crc kubenswrapper[4790]: I0313 20:28:31.606531 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.605368 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.801737 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.801943 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:32 crc kubenswrapper[4790]: I0313 20:28:32.803655 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:32 crc kubenswrapper[4790]: E0313 20:28:32.803817 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.368763 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.368974 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.370440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.370484 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.370495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.374221 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.607617 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.911087 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.912415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.912450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:33 crc kubenswrapper[4790]: I0313 20:28:33.912463 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:34 crc kubenswrapper[4790]: I0313 20:28:34.604955 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope 
Mar 13 20:28:35 crc kubenswrapper[4790]: W0313 20:28:35.012428 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 20:28:35 crc kubenswrapper[4790]: E0313 20:28:35.012515 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:35 crc kubenswrapper[4790]: W0313 20:28:35.022250 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 20:28:35 crc kubenswrapper[4790]: E0313 20:28:35.022324 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:35 crc kubenswrapper[4790]: I0313 20:28:35.606464 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:36 crc kubenswrapper[4790]: I0313 20:28:36.605478 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.605527 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.778719 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.779011 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.780611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.780661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.780675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:37 crc kubenswrapper[4790]: I0313 20:28:37.781304 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:37 crc kubenswrapper[4790]: E0313 20:28:37.781531 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:37 crc kubenswrapper[4790]: W0313 20:28:37.985606 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 13 20:28:37 crc kubenswrapper[4790]: E0313 20:28:37.985729 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.305903 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307495 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307549 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.307602 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:38 crc kubenswrapper[4790]: E0313 20:28:38.310091 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:38 crc kubenswrapper[4790]: E0313 20:28:38.310414 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:38 crc kubenswrapper[4790]: I0313 20:28:38.607017 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:39 crc kubenswrapper[4790]: I0313 20:28:39.605677 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:39 crc kubenswrapper[4790]: E0313 20:28:39.716892 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:40 crc kubenswrapper[4790]: I0313 20:28:40.605414 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:41 crc kubenswrapper[4790]: W0313 20:28:41.188333 4790 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: 
csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:41 crc kubenswrapper[4790]: E0313 20:28:41.188407 4790 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 20:28:41 crc kubenswrapper[4790]: I0313 20:28:41.605539 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:42 crc kubenswrapper[4790]: I0313 20:28:42.608019 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.608650 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.669825 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.670784 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.672588 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.672642 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:43 crc kubenswrapper[4790]: I0313 20:28:43.672661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:44 crc kubenswrapper[4790]: I0313 20:28:44.606250 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.310390 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.311747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.311812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.311824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.312140 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:45 crc kubenswrapper[4790]: E0313 20:28:45.315413 4790 controller.go:145] "Failed to ensure 
lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:45 crc kubenswrapper[4790]: E0313 20:28:45.315465 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:45 crc kubenswrapper[4790]: I0313 20:28:45.606536 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:46 crc kubenswrapper[4790]: I0313 20:28:46.605256 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:47 crc kubenswrapper[4790]: I0313 20:28:47.605147 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:48 crc kubenswrapper[4790]: I0313 20:28:48.606821 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:49 crc kubenswrapper[4790]: I0313 20:28:49.604864 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:49 crc kubenswrapper[4790]: E0313 20:28:49.717691 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:50 crc kubenswrapper[4790]: I0313 20:28:50.608551 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.604831 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.658916 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660112 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.660156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 
20:28:51.660685 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.978428 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.980836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab"} Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.981078 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.982529 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.982582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:51 crc kubenswrapper[4790]: I0313 20:28:51.982608 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.315591 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.316957 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.316997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.317009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.317030 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:52 crc kubenswrapper[4790]: E0313 20:28:52.320657 4790 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 13 20:28:52 crc kubenswrapper[4790]: E0313 20:28:52.321091 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.607948 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.801046 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.984976 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:28:52 crc 
kubenswrapper[4790]: I0313 20:28:52.985699 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987505 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" exitCode=255 Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab"} Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987969 4790 scope.go:117] "RemoveContainer" containerID="1d18cf69538bfc7de3613ae38b728c9f3d0e38ca99b39fb09f625bd27c4e542e" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.987752 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.989094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.989360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.990357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:52 crc kubenswrapper[4790]: I0313 20:28:52.991419 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:28:52 crc kubenswrapper[4790]: E0313 20:28:52.991694 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.605561 4790 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.991248 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.993981 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.994830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.994869 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:53 crc kubenswrapper[4790]: I0313 20:28:53.994883 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:53 crc 
kubenswrapper[4790]: I0313 20:28:53.995399 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:28:53 crc kubenswrapper[4790]: E0313 20:28:53.995551 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.219397 4790 csr.go:261] certificate signing request csr-72vmj is approved, waiting to be issued Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.231016 4790 csr.go:257] certificate signing request csr-72vmj is issued Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.262313 4790 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 13 20:28:54 crc kubenswrapper[4790]: I0313 20:28:54.452474 4790 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 13 20:28:55 crc kubenswrapper[4790]: I0313 20:28:55.233684 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-03 12:34:24.682533494 +0000 UTC Mar 13 20:28:55 crc kubenswrapper[4790]: I0313 20:28:55.234026 4790 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6352h5m29.448512437s for next certificate rotation Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.779069 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.779256 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.780702 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.780738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.780754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:57 crc kubenswrapper[4790]: I0313 20:28:57.781291 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:28:57 crc kubenswrapper[4790]: E0313 20:28:57.781497 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.321315 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.322910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 
crc kubenswrapper[4790]: I0313 20:28:59.322959 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.322977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.323093 4790 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.331236 4790 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.331591 4790 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.331628 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.336741 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.353922 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360490 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.360514 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.369597 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377186 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.377215 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.386146 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392841 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392854 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:28:59 crc kubenswrapper[4790]: I0313 20:28:59.392884 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:28:59Z","lastTransitionTime":"2026-03-13T20:28:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.401394 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.401544 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.401572 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.501755 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.601957 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.702674 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.717837 4790 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.803409 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:28:59 crc kubenswrapper[4790]: E0313 20:28:59.904038 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.004130 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.104644 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.205502 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.306336 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.406893 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.507798 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.608856 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.709983 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.811117 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:00 crc kubenswrapper[4790]: E0313 20:29:00.912138 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.013199 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.114302 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.215263 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.316201 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.416990 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc 
kubenswrapper[4790]: E0313 20:29:01.517524 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.618466 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.719508 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.819933 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:01 crc kubenswrapper[4790]: E0313 20:29:01.920887 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.021277 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.122332 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.223138 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.323743 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.424419 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: I0313 20:29:02.450325 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.524926 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.625788 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.726818 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.827614 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:02 crc kubenswrapper[4790]: E0313 20:29:02.928483 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.029731 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.130615 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.231769 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.331892 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.432807 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.534016 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.635057 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.736691 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.837719 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:03 crc kubenswrapper[4790]: E0313 20:29:03.938427 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.039073 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.139328 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.239841 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.341095 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.442053 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.542738 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.643459 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.744043 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.845269 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:04 crc kubenswrapper[4790]: E0313 20:29:04.946407 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.046757 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.147550 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.248028 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.349275 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.450149 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.551289 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.652191 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.752505 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.853429 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:05 crc kubenswrapper[4790]: E0313 20:29:05.953655 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.054196 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.155031 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.255429 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.355754 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.456346 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.556974 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.657648 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.758468 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.858833 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:06 crc kubenswrapper[4790]: E0313 20:29:06.959135 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.060133 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.160762 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.260980 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.361452 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.462047 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.562729 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.663761 4790 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.764113 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.864914 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:07 crc kubenswrapper[4790]: E0313 20:29:07.966104 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.067037 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.167796 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.268236 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.368552 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.469799 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.570896 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.672018 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.773097 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.874427 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:08 crc kubenswrapper[4790]: E0313 20:29:08.975749 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.076198 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.177230 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.278617 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.379719 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.480630 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.582265 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.683367 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.718505 4790 eviction_manager.go:285] "Eviction manager: failed to get 
summary stats" err="failed to get node info: node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.764612 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769616 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.769693 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.779117 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783863 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.783888 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.794014 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797810 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797871 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.797906 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.809871 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.813958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814005 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814018 4790 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:09 crc kubenswrapper[4790]: I0313 20:29:09.814052 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:09Z","lastTransitionTime":"2026-03-13T20:29:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.825820 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.825974 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.826015 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:09 crc kubenswrapper[4790]: E0313 20:29:09.926144 4790 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.026782 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.127532 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.227728 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.328029 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.429298 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.530234 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.631262 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.731480 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.831987 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:10 crc kubenswrapper[4790]: E0313 20:29:10.932930 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.033315 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.134427 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.235554 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.335963 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.436999 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.537301 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.638528 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.739445 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.840546 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:11 crc kubenswrapper[4790]: E0313 20:29:11.941504 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 
20:29:12.042726 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.143071 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.243526 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.344014 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.445154 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.546287 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.647044 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.659760 4790 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.660990 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.661026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.661036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:12 crc kubenswrapper[4790]: I0313 20:29:12.661571 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.661736 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.747984 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.848702 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:12 crc kubenswrapper[4790]: E0313 20:29:12.949818 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.050872 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.151435 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.252011 4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.353045 
4790 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.359955 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.361762 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.455860 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.455939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.455964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.456020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.456043 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558791 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558859 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.558919 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.626695 4790 apiserver.go:52] "Watching apiserver" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.640169 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.641594 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-image-registry/node-ca-9tpww","openshift-machine-config-operator/machine-config-daemon-drtsx","openshift-multus/multus-additional-cni-plugins-wq8kp","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75","openshift-multus/network-metrics-daemon-mnf26","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-x4d2p","openshift-network-node-identity/network-node-identity-vrzqb","openshift-multus/multus-x2tjg","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.642263 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.642269 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.642728 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643178 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643252 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643308 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.643782 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643794 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.644235 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.644251 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.643937 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646263 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646303 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646336 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646400 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646464 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646758 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.646770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647454 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647465 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647484 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.647859 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648035 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648141 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648215 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.648307 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.650951 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.653895 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654180 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654437 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654515 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.654719 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.655431 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.655875 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.656860 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.656937 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.657008 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.656882 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.657203 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.657435 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.658553 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.658598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.658886 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.659020 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.660997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661023 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661032 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661057 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661639 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661795 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661843 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661857 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.661504 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662020 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662083 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.662175 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.680973 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.697110 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.701751 4790 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.712407 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.726842 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.747988 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.757716 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.763303 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.765013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.772826 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.782223 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784643 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784782 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784801 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784824 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784869 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784908 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784926 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784944 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784961 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.784981 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785002 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785058 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785075 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785137 4790 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785230 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785286 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785396 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785416 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785537 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785558 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785594 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785613 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785653 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785674 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785695 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785717 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785743 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785761 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785778 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785832 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785849 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785868 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785891 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785910 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785930 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785950 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785989 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.786050 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786069 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786089 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786152 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786210 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786227 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.786247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786283 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786340 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786366 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786405 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786423 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786442 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786478 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786534 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786555 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786615 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod 
\"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786670 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786711 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786754 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786830 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786847 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786864 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786961 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787002 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.787024 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787081 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787099 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787119 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787135 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787155 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 
crc kubenswrapper[4790]: I0313 20:29:13.787212 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787228 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787245 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787287 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787372 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787413 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.787434 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787478 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787501 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787547 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787570 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787591 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787611 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787629 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.787647 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787669 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787725 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787743 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787763 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787785 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787809 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.787852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787870 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787890 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787927 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787949 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787968 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.788047 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785160 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785598 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788134 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.785950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786015 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786307 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786344 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786619 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.786956 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787289 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787623 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787690 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.787964 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788605 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788640 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788668 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.788765 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.288746633 +0000 UTC m=+85.309862644 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789019 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789370 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789440 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789499 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.789746 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790040 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790282 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790632 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.790706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.791540 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.791680 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.791992 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792162 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792524 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.788173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792707 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792734 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792760 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792790 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792813 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792835 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792855 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792875 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792895 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792959 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792978 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.792998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793017 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793049 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793061 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793107 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793162 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793290 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793348 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793357 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793395 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793456 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793484 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793508 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793813 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmkvj\" (UniqueName: \"kubernetes.io/projected/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-kube-api-access-pmkvj\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793906 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.793957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58464a30-7f56-4e13-894e-e53498a85637-mcd-auth-proxy-config\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cni-binary-copy\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-socket-dir-parent\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794211 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58464a30-7f56-4e13-894e-e53498a85637-rootfs\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-hostroot\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.794242 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-daemon-config\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794338 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g48h\" (UniqueName: \"kubernetes.io/projected/c54336a0-5a12-4bf9-9807-337dd352fdb6-kube-api-access-7g48h\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-system-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794448 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794471 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cnibin\") pod \"multus-x2tjg\" (UID: 
\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794488 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794532 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-bin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794585 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05405fad-1758-412e-b3ab-9714a604b207-host\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794631 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-cnibin\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794715 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794723 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-multus-certs\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794859 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-os-release\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794908 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-conf-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794952 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.794997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58464a30-7f56-4e13-894e-e53498a85637-proxy-tls\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-system-cni-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795057 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7thmq\" (UniqueName: \"kubernetes.io/projected/05405fad-1758-412e-b3ab-9714a604b207-kube-api-access-7thmq\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795226 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h76wc\" (UniqueName: \"kubernetes.io/projected/96d699b6-dfba-4b76-b3e8-0480527aa386-kube-api-access-h76wc\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2vjb\" (UniqueName: \"kubernetes.io/projected/58464a30-7f56-4e13-894e-e53498a85637-kube-api-access-h2vjb\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-kubelet\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6x7z\" (UniqueName: \"kubernetes.io/projected/58c65c62-097b-4179-9ada-1627afa9fef2-kube-api-access-w6x7z\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795573 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-etc-kubernetes\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795630 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795656 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/58c65c62-097b-4179-9ada-1627afa9fef2-hosts-file\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f8e0711-7595-4580-b702-558512c33395-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795727 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-multus\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " 
pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795778 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-netns\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795814 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795861 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795905 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05405fad-1758-412e-b3ab-9714a604b207-serviceca\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-binary-copy\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq7qg\" (UniqueName: \"kubernetes.io/projected/2f8e0711-7595-4580-b702-558512c33395-kube-api-access-fq7qg\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-os-release\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796104 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796154 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796627 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796654 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796681 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796710 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796735 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796761 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796785 4790 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796808 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 
20:29:13.796830 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796853 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796875 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796898 4790 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796925 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796946 4790 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796969 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795342 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.795424 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796169 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796232 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.796871 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797130 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797141 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797154 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797324 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797726 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.797917 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798069 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798308 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798609 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.798984 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799139 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799157 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800122 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800150 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800170 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800187 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800251 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.800449 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.799361 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811819 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811846 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811874 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811898 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811920 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811941 4790 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811962 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.811983 4790 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812004 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812025 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812048 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812069 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812092 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812114 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812134 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812155 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812176 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812198 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812218 4790 reconciler_common.go:293] "Volume detached for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812239 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812259 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812280 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812302 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812323 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812344 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.812365 4790 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822719 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822760 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822778 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.822853 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.322830968 +0000 UTC m=+85.343946859 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.823925 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.823992 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.824023 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.824194 4790 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.824341 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.324098884 +0000 UTC m=+85.345214805 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.824463 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.824671 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825092 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825575 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825935 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.825919 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826180 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826476 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826838 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826921 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.826981 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827120 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827123 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827134 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827851 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.827885 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827888 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.827908 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828044 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.828066 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.828165 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.328139149 +0000 UTC m=+85.349255100 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.828257 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.328212541 +0000 UTC m=+85.349328542 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828281 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828316 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828618 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.828968 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.829330 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.829391 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830046 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830121 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830076 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830345 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830551 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.830735 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831263 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831158 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831444 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831751 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831839 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.831950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832019 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832607 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832602 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.832933 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.833188 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.833591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.833765 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.834536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.837198 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.837541 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.838061 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.839844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.840998 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.842300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.842731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.843047 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.843295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.843530 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.844414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.844531 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.844778 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.847433 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.849303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.854856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.855731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856070 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856183 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.856730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857048 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857110 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857266 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857335 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.857423 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858435 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858519 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858671 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.858913 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.859942 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860237 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.860827 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861128 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861497 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.861782 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862227 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862480 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862615 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862667 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.862699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.864848 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.864969 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.865049 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.865450 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.865684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.866052 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.866284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.866971 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867187 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867305 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867350 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.867986 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868013 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868120 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868356 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868366 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868393 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868175 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868404 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868601 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868895 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.868979 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.876818 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.881124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.887480 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.890317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.894590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7g48h\" (UniqueName: \"kubernetes.io/projected/c54336a0-5a12-4bf9-9807-337dd352fdb6-kube-api-access-7g48h\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-system-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913560 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913630 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-k8s-cni-cncf-io\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913672 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cnibin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913794 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05405fad-1758-412e-b3ab-9714a604b207-host\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913870 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-system-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/05405fad-1758-412e-b3ab-9714a604b207-host\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cnibin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914177 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-cnibin\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.913824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-cnibin\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-bin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-os-release\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914816 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-multus-certs\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914967 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915012 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-os-release\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58464a30-7f56-4e13-894e-e53498a85637-proxy-tls\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915107 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915157 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-cni-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-system-cni-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-system-cni-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915211 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-conf-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.915226 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: 
\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915237 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-conf-dir\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: E0313 20:29:13.915306 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:14.415283652 +0000 UTC m=+85.436399573 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-multus-certs\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7thmq\" (UniqueName: \"kubernetes.io/projected/05405fad-1758-412e-b3ab-9714a604b207-kube-api-access-7thmq\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h76wc\" (UniqueName: \"kubernetes.io/projected/96d699b6-dfba-4b76-b3e8-0480527aa386-kube-api-access-h76wc\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915561 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-kubelet\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915660 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6x7z\" (UniqueName: \"kubernetes.io/projected/58c65c62-097b-4179-9ada-1627afa9fef2-kube-api-access-w6x7z\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915784 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-env-overrides\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915803 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-kubelet\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.915936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.914901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-h2vjb\" (UniqueName: \"kubernetes.io/projected/58464a30-7f56-4e13-894e-e53498a85637-kube-api-access-h2vjb\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-etc-kubernetes\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-etc-kubernetes\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916433 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/96d699b6-dfba-4b76-b3e8-0480527aa386-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916542 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/58c65c62-097b-4179-9ada-1627afa9fef2-hosts-file\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f8e0711-7595-4580-b702-558512c33395-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916688 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-multus\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/58c65c62-097b-4179-9ada-1627afa9fef2-hosts-file\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.916824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2f8e0711-7595-4580-b702-558512c33395-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917275 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-multus\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-netns\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05405fad-1758-412e-b3ab-9714a604b207-serviceca\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-binary-copy\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq7qg\" (UniqueName: \"kubernetes.io/projected/2f8e0711-7595-4580-b702-558512c33395-kube-api-access-fq7qg\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917635 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-os-release\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917690 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmkvj\" (UniqueName: \"kubernetes.io/projected/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-kube-api-access-pmkvj\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod 
\"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917796 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917862 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58464a30-7f56-4e13-894e-e53498a85637-mcd-auth-proxy-config\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917903 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-socket-dir-parent\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917963 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58464a30-7f56-4e13-894e-e53498a85637-rootfs\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cni-binary-copy\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-hostroot\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " 
pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918079 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-daemon-config\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918116 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-run-netns\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/58464a30-7f56-4e13-894e-e53498a85637-rootfs\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-socket-dir-parent\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.917904 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.918412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.919025 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/58464a30-7f56-4e13-894e-e53498a85637-mcd-auth-proxy-config\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-hostroot\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920426 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-os-release\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.920544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921712 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921844 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921924 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.921998 4790 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922073 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922147 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922237 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922315 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922458 4790 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922566 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922723 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922811 4790 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922891 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.922971 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923050 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923137 4790 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923211 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923417 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923515 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923590 4790 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923660 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: 
I0313 20:29:13.923730 4790 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923832 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923913 4790 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.923986 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.924064 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.924133 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.924968 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925078 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925152 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925228 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925299 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925369 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925483 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925561 4790 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925635 4790 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925706 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925806 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925881 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.925956 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926027 4790 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926105 4790 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926180 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926251 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926321 4790 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926418 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926508 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926582 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926660 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926731 4790 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926801 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926881 4790 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.926956 4790 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927041 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927122 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927202 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927282 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927588 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927680 4790 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927752 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927836 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node 
\"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.927915 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928000 4790 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928082 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928156 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928227 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928304 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928395 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928498 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928574 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928657 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928744 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928827 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928907 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.928980 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929049 4790 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929123 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929196 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929268 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929352 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929447 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929519 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929598 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929677 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929747 4790 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929842 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929916 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: 
\"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.929985 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930060 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930135 4790 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930222 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930295 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930366 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930506 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930543 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930602 4790 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930617 4790 reconciler_common.go:293] "Volume detached 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930629 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930650 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930664 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930677 4790 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930690 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930702 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930720 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930731 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930744 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930757 4790 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930769 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930781 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930792 4790 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930804 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930820 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930842 4790 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930854 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930864 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930873 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930883 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930894 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930903 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930911 4790 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930923 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930932 4790 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930940 
4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930952 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930961 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930970 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930978 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930986 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930994 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931005 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931013 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931022 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931031 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931039 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931053 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931062 4790 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931070 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931099 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931110 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931119 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931131 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931142 4790 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931156 4790 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931167 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931180 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931190 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931199 4790 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931210 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc 
kubenswrapper[4790]: I0313 20:29:13.931219 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.931227 4790 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.930488 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-host-var-lib-cni-bin\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.932080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.932293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/05405fad-1758-412e-b3ab-9714a604b207-serviceca\") pod \"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.932872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-cni-binary-copy\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.936594 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7g48h\" (UniqueName: \"kubernetes.io/projected/c54336a0-5a12-4bf9-9807-337dd352fdb6-kube-api-access-7g48h\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.937600 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.937705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h76wc\" (UniqueName: \"kubernetes.io/projected/96d699b6-dfba-4b76-b3e8-0480527aa386-kube-api-access-h76wc\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.938269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/58464a30-7f56-4e13-894e-e53498a85637-proxy-tls\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.938721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-multus-daemon-config\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.939181 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmkvj\" (UniqueName: \"kubernetes.io/projected/207e7f49-094a-4e59-a8ff-9eacd8d6fe2a-kube-api-access-pmkvj\") pod \"multus-x2tjg\" (UID: \"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\") " pod="openshift-multus/multus-x2tjg" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.939719 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/96d699b6-dfba-4b76-b3e8-0480527aa386-cni-binary-copy\") pod \"multus-additional-cni-plugins-wq8kp\" (UID: \"96d699b6-dfba-4b76-b3e8-0480527aa386\") " pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.940154 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2f8e0711-7595-4580-b702-558512c33395-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.940179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.942678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2vjb\" (UniqueName: \"kubernetes.io/projected/58464a30-7f56-4e13-894e-e53498a85637-kube-api-access-h2vjb\") pod \"machine-config-daemon-drtsx\" (UID: \"58464a30-7f56-4e13-894e-e53498a85637\") " pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.943138 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq7qg\" (UniqueName: \"kubernetes.io/projected/2f8e0711-7595-4580-b702-558512c33395-kube-api-access-fq7qg\") pod \"ovnkube-control-plane-749d76644c-lgs75\" (UID: \"2f8e0711-7595-4580-b702-558512c33395\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.943804 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"ovnkube-node-gz4fj\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.945274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7thmq\" (UniqueName: \"kubernetes.io/projected/05405fad-1758-412e-b3ab-9714a604b207-kube-api-access-7thmq\") pod 
\"node-ca-9tpww\" (UID: \"05405fad-1758-412e-b3ab-9714a604b207\") " pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.947242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6x7z\" (UniqueName: \"kubernetes.io/projected/58c65c62-097b-4179-9ada-1627afa9fef2-kube-api-access-w6x7z\") pod \"node-resolver-x4d2p\" (UID: \"58c65c62-097b-4179-9ada-1627afa9fef2\") " pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.971116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.972705 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:13Z","lastTransitionTime":"2026-03-13T20:29:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.980171 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.989751 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 13 20:29:13 crc kubenswrapper[4790]: W0313 20:29:13.997445 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58 WatchSource:0}: Error finding container 1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58: Status 404 returned error can't find the container with id 1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58 Mar 13 20:29:13 crc kubenswrapper[4790]: I0313 20:29:13.999506 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-x4d2p" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.006293 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.009080 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c WatchSource:0}: Error finding container 884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c: Status 404 returned error can't find the container with id 884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.014564 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.023773 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.031346 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58c65c62_097b_4179_9ada_1627afa9fef2.slice/crio-05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed WatchSource:0}: Error finding container 05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed: Status 404 returned error can't find the container with id 05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.033124 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c9dff4_5508_4391_bb03_6710c2b9f3b5.slice/crio-7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0 WatchSource:0}: Error finding container 7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0: Status 404 returned error can't find the container with id 7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0 Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.040762 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.048913 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-x2tjg" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.050342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.054450 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x4d2p" event={"ID":"58c65c62-097b-4179-9ada-1627afa9fef2","Type":"ContainerStarted","Data":"05fa2bda765ac8792bd63f0e6656247d0b0d37fe60b0b40a384a5685853fc4ed"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.056187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1b716c9f8ad89e06f510395191784b35a518cc42d7ef28f37488b4a947a23a58"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.057438 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-9tpww" Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.058357 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96d699b6_dfba_4b76_b3e8_0480527aa386.slice/crio-773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b WatchSource:0}: Error finding container 773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b: Status 404 returned error can't find the container with id 773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.058558 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"6a01ef179717ba531d8bd41a0999e48b75a6c0277adf22c04cc1a853f2ae431b"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.061951 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"884e0948e2f90d5908057cf0eca253fb535db59e2880cef0470e5c091aa6e93c"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.076916 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.084316 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58464a30_7f56_4e13_894e_e53498a85637.slice/crio-d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939 WatchSource:0}: Error finding container d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939: Status 404 returned error can't find the container with id d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939 Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.110003 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05405fad_1758_412e_b3ab_9714a604b207.slice/crio-cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b WatchSource:0}: Error finding container cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b: Status 404 returned error can't find the container with id cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b Mar 13 20:29:14 crc kubenswrapper[4790]: W0313 20:29:14.141793 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod207e7f49_094a_4e59_a8ff_9eacd8d6fe2a.slice/crio-c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15 WatchSource:0}: Error finding container c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15: Status 404 returned error can't find the container with id c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15 Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181285 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.181324 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289306 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289325 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.289336 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334581 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334764 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.334740955 +0000 UTC m=+86.355856846 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334812 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334871 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.334855628 +0000 UTC m=+86.355971609 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334891 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.334923 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.33491614 +0000 UTC m=+86.356032031 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.334981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335062 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335073 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335082 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 
20:29:14.335106 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.335099915 +0000 UTC m=+86.356215806 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335149 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335159 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335166 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.335184 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.335178107 +0000 UTC m=+86.356293998 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.391936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.391982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.391995 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.392017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.392032 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.436160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.436342 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: E0313 20:29:14.436457 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:15.436427913 +0000 UTC m=+86.457543814 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495304 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.495412 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.597952 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.700521 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.803718 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906689 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:14 crc kubenswrapper[4790]: I0313 20:29:14.906762 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:14Z","lastTransitionTime":"2026-03-13T20:29:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008742 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.008758 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.068166 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-x4d2p" event={"ID":"58c65c62-097b-4179-9ada-1627afa9fef2","Type":"ContainerStarted","Data":"e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.073426 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.076156 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tpww" event={"ID":"05405fad-1758-412e-b3ab-9714a604b207","Type":"ContainerStarted","Data":"2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.076179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-9tpww" event={"ID":"05405fad-1758-412e-b3ab-9714a604b207","Type":"ContainerStarted","Data":"cd61050d42c84009eea1306cad91215cfdb6c00ea85a4a51693551618085766b"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.078432 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09" exitCode=0 Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.078465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.078495 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerStarted","Data":"773eee4aecd56d69b6cc681691e6af32bd27dd70ec94afc0332c51d79c84353b"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.081406 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.082915 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" exitCode=0 Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.082995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.091588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" event={"ID":"2f8e0711-7595-4580-b702-558512c33395","Type":"ContainerStarted","Data":"c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.091647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" event={"ID":"2f8e0711-7595-4580-b702-558512c33395","Type":"ContainerStarted","Data":"cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.091659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" event={"ID":"2f8e0711-7595-4580-b702-558512c33395","Type":"ContainerStarted","Data":"b2b5c47a36a821bffe16ccd4f1169622238f672dacf2f9863c33e35119b7c278"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.093283 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.093349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.095873 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.095918 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"c49eedb2d76eaa4770def86510fe121a30b45fee5cb2a90f64616c5d795ddc15"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.098343 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.098403 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.098418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"d18401980d1289de2b5ff091072e4b0d4e6052cfd8baf7e6e1284f1774626939"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.100566 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111705 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.111730 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.115013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.127037 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.138691 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.151281 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.164250 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.181219 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.197590 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213962 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.213987 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.219715 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.241498 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.254041 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.267724 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.282730 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.296638 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317210 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317220 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.317248 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.319814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.335130 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345227 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345304 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345277879 +0000 UTC m=+88.366393770 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.345343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345359 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345398 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345414 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345424 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345433 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345422083 +0000 UTC m=+88.366538014 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345462 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345450524 +0000 UTC m=+88.366566405 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345496 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345507 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345508 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345521 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345528 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345522586 +0000 UTC m=+88.366638477 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.345544 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.345538077 +0000 UTC m=+88.366653968 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.347917 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.359958 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.381014 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.401759 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.417647 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is 
after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419937 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.419963 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.430046 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"sta
rted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.446874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.447082 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.447169 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:17.447146882 +0000 UTC m=+88.468262833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.448407 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPat
h\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":
true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.460814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc 
kubenswrapper[4790]: I0313 20:29:15.472199 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.483769 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.495447 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:15Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522762 4790 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522771 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.522795 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.625369 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.659713 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659534 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659969 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.659705 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.660100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.660184 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:15 crc kubenswrapper[4790]: E0313 20:29:15.660214 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.669932 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.670794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.671615 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.672237 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.685919 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.686823 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.688243 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.689862 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 13 20:29:15 crc 
kubenswrapper[4790]: I0313 20:29:15.691540 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.692253 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.692967 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.694956 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.695889 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.698243 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.699859 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.702940 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.704062 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.706184 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.707136 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.708077 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.711104 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.711963 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.713048 4790 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.714614 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.715307 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.716650 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.718461 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.718933 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.719661 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.720920 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.721420 4790 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.721536 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.725111 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.725685 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.726198 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729262 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729412 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729681 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.729625 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.730727 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.731284 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.732436 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.733113 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.733989 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.734573 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.735554 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.736507 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.736946 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.737824 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" 
path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.738397 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.739470 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.740050 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.740605 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.741547 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.742065 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.743026 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.743486 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.743926 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.832690 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937458 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:15 crc kubenswrapper[4790]: I0313 20:29:15.937485 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:15Z","lastTransitionTime":"2026-03-13T20:29:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039609 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039658 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.039667 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.103334 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696" exitCode=0 Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.103429 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107711 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.107779 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.119467 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.130778 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.139196 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142505 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142517 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.142557 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.149416 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.163710 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.176751 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.193027 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.203653 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.214760 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.228714 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245342 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245324 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\
":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245446 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.245468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.265694 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.279250 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.297241 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\
":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatu
ses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.311512 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:16Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347745 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347761 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.347771 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.449977 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450028 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.450077 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552171 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552236 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.552294 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655644 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.655736 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758073 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.758141 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.860934 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.860975 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.860985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.861000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.861011 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:16 crc kubenswrapper[4790]: I0313 20:29:16.964307 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:16Z","lastTransitionTime":"2026-03-13T20:29:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072346 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072373 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.072404 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.113411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.113475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.115287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.118644 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375" exitCode=0 Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.118683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.132565 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.144059 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.159284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175573 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175629 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.175595 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z 
is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.185076 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.194439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.204414 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.217346 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.227796 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.237453 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.249588 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.260135 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.270245 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281322 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281340 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281352 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.281622 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.294030 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.304483 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.314172 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.323645 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.332896 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.345991 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.360599 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.369920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370085 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.370119 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370225 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370249 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370269 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370314 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370299011 +0000 UTC m=+92.391414912 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370412 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370368403 +0000 UTC m=+92.391484294 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370459 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370469 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370477 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370498 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370491957 +0000 UTC m=+92.391607838 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370527 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370546 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.370539888 +0000 UTC m=+92.391655779 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370582 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.370602 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.37059661 +0000 UTC m=+92.391712501 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.374812 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-
dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.383664 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.389858 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.401230 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.412025 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.423105 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.435130 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.447484 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.464571 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.470583 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.470881 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.470932 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:21.470918248 +0000 UTC m=+92.492034139 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.479246 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/
opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:17Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.485948 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.485991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.486000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.486015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.486024 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.588755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.588999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.589009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.589024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.589037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659592 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659646 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659593 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.659596 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.659754 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.659986 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.660175 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:17 crc kubenswrapper[4790]: E0313 20:29:17.660218 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692485 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.692498 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795080 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.795094 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898125 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:17 crc kubenswrapper[4790]: I0313 20:29:17.898161 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:17Z","lastTransitionTime":"2026-03-13T20:29:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000415 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000481 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.000492 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108548 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.108559 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.124839 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460" exitCode=0 Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.124949 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.147055 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.159965 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.180481 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.196887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.206807 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210337 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210364 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.210392 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.219236 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.228999 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.238403 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.250998 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.264819 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.283003 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.294273 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.312085 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313223 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.313304 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.331281 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.344878 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:18Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416646 4790 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.416931 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.417002 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521224 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.521255 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624044 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.624107 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.726635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829468 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829484 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.829535 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:18 crc kubenswrapper[4790]: I0313 20:29:18.935161 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:18Z","lastTransitionTime":"2026-03-13T20:29:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037701 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.037949 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.136057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.144266 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.147708 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283" exitCode=0 Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.147749 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.167405 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.184561 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.202623 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.226814 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.238557 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248230 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248289 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248308 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.248317 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.249759 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.274867 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.291552 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.313222 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.329397 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.342455 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350589 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.350681 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.355010 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.364687 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.378050 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.389168 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453818 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.453893 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557488 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557613 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.557629 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658867 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658879 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.658952 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659471 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659542 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:19 crc kubenswrapper[4790]: E0313 20:29:19.659617 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.660293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.675410 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.687239 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.698769 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.711606 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.725439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.741482 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763439 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763448 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.763472 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.764578 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z 
is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.781134 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"
Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.794681 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.805635 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.816630 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.828613 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.843113 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.858012 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865602 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.865865 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.871469 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968347 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968405 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968421 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:19 crc kubenswrapper[4790]: I0313 20:29:19.968446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:19Z","lastTransitionTime":"2026-03-13T20:29:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070437 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.070446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.155274 4790 generic.go:334] "Generic (PLEG): container finished" podID="96d699b6-dfba-4b76-b3e8-0480527aa386" containerID="e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b" exitCode=0 Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.155315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerDied","Data":"e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173234 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173244 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.173264 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.175198 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z 
is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.200403 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635a
eb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\
\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201048 4790 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.201131 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.214225 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.217971 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223772 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.223812 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.226530 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.237301 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.239151 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\
\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244102 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244142 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244158 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.244168 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.250394 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.257931 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.261558 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.266685 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.271801 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.278523 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281650 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281680 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281690 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.281714 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.286500 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.295340 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: E0313 20:29:20.295529 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296834 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.296877 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.299528 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.314548 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.328422 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.341546 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.354865 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.368945 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:20Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.400782 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.503179 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605049 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.605072 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707442 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707889 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707912 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.707931 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810065 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.810074 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912332 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912396 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912409 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:20 crc kubenswrapper[4790]: I0313 20:29:20.912438 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:20Z","lastTransitionTime":"2026-03-13T20:29:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014296 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014348 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.014418 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116214 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.116294 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.161532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" event={"ID":"96d699b6-dfba-4b76-b3e8-0480527aa386","Type":"ContainerStarted","Data":"312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.166553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.167365 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.167451 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.167540 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.175199 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.190466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.201350 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.206109 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.206167 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.210823 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218451 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.218500 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.224505 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.234397 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.247342 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.262939 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.273637 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.287159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.298054 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.310437 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed
\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.320501 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.321297 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.337128 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.352641 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.364828 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.377044 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.392343 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.406093 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.410960 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411135 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.41110746 +0000 UTC m=+100.432223361 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411180 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411283 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411308 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411342 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.411283 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411408 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411416 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411431 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411455 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411469 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411473 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.41145172 +0000 UTC m=+100.432567651 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411502 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.411490201 +0000 UTC m=+100.432606132 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411524 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.411513552 +0000 UTC m=+100.432629493 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.411557 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.411545672 +0000 UTC m=+100.432661603 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.421767 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423291 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423303 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.423311 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.440165 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.452736 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.468316 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.482874 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.505301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"D
isabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.512493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.512617 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 
20:29:21.512695 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:29.512677735 +0000 UTC m=+100.533793636 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.525170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts
\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.526908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527153 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.527209 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.539412 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.552247 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.563335 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.574943 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:21Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.628941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629200 4790 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.629473 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659705 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.659829 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659897 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659954 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.659972 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.660040 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.660184 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:21 crc kubenswrapper[4790]: E0313 20:29:21.660279 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731953 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.731982 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834656 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.834684 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937883 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:21 crc kubenswrapper[4790]: I0313 20:29:21.937922 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:21Z","lastTransitionTime":"2026-03-13T20:29:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.040762 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143422 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.143432 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245509 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245577 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.245605 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348744 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.348844 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.451780 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554241 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554268 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.554278 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.656186 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758198 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.758261 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860325 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860398 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.860414 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:22 crc kubenswrapper[4790]: I0313 20:29:22.962900 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:22Z","lastTransitionTime":"2026-03-13T20:29:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065067 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065127 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.065162 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167270 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.167366 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270119 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.270192 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.372335 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474961 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.474995 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577763 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577821 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.577830 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.659435 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.659515 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.659528 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.660092 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660257 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660450 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:23 crc kubenswrapper[4790]: E0313 20:29:23.660707 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680175 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.680237 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.782912 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885858 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885907 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.885961 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988428 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:23 crc kubenswrapper[4790]: I0313 20:29:23.988496 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:23Z","lastTransitionTime":"2026-03-13T20:29:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.091509 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.177008 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/0.log" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.180518 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840" exitCode=1 Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.180584 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.181824 4790 scope.go:117] "RemoveContainer" containerID="d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193408 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193483 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.193493 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.195755 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.217355 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.233449 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.247060 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.261284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.276089 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.291368 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297888 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297979 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.297993 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.305498 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.319826 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.334142 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.345596 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.359269 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.370877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.386345 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:23Z\\\",\\\"message\\\":\\\"0:29:23.322508 6693 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:29:23.322517 6693 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:29:23.322524 6693 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:29:23.322595 6693 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:29:23.322648 6693 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 20:29:23.322660 6693 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 20:29:23.322691 6693 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 20:29:23.322737 6693 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 20:29:23.322747 6693 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:29:23.322735 6693 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 20:29:23.322765 6693 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 20:29:23.322770 6693 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 20:29:23.322751 6693 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 20:29:23.322797 6693 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:29:23.323000 6693 factory.go:656] Stopping watch factory\\\\nI0313 20:29:23.323035 6693 ovnkube.go:599] Stopped 
ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.398817 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[
{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:24Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400156 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400165 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400179 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.400188 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502278 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.502287 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604530 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.604569 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707875 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707960 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.707972 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809729 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.809740 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913645 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913655 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:24 crc kubenswrapper[4790]: I0313 20:29:24.913682 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:24Z","lastTransitionTime":"2026-03-13T20:29:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016692 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.016738 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119632 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119673 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119682 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.119707 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.185678 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.186579 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/0.log" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.190285 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" exitCode=1 Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.190325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.190362 4790 scope.go:117] "RemoveContainer" containerID="d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.191513 4790 scope.go:117] "RemoveContainer" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.191805 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.205851 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.214498 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222562 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222602 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.222635 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.224533 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc 
kubenswrapper[4790]: I0313 20:29:25.234683 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.249363 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.263556 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.276810 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"na
me\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.291803 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.303981 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.315976 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326781 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.326838 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.354460 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.371493 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.385888 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.398884 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.418112 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:23Z\\\",\\\"message\\\":\\\"0:29:23.322508 6693 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:29:23.322517 6693 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:29:23.322524 6693 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:29:23.322595 6693 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:29:23.322648 6693 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 20:29:23.322660 6693 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 20:29:23.322691 6693 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 20:29:23.322737 6693 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 20:29:23.322747 6693 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:29:23.322735 6693 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 20:29:23.322765 6693 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 20:29:23.322770 6693 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 20:29:23.322751 6693 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 20:29:23.322797 6693 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:29:23.323000 6693 factory.go:656] Stopping watch factory\\\\nI0313 20:29:23.323035 6693 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID
\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:25Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" 
Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.441312 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545143 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.545236 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648520 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648550 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.648561 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658940 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658959 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658955 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.658960 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659081 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659224 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659521 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:25 crc kubenswrapper[4790]: E0313 20:29:25.659608 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.751556 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854861 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854932 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.854980 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.957904 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.957968 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.957986 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.958008 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:25 crc kubenswrapper[4790]: I0313 20:29:25.958024 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:25Z","lastTransitionTime":"2026-03-13T20:29:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061248 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061297 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061307 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.061340 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164552 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.164663 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.197549 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266733 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.266743 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370280 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.370301 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.473879 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.576944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577056 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.577075 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.679958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680029 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.680068 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.783346 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886027 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.886094 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988752 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:26 crc kubenswrapper[4790]: I0313 20:29:26.988821 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:26Z","lastTransitionTime":"2026-03-13T20:29:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091343 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.091398 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193730 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193746 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.193759 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296274 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296292 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296313 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.296330 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.398423 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.500910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.500973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.500985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.501003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.501014 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.603720 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659828 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659902 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.659978 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660188 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660347 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660564 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.660690 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.678126 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:27 crc kubenswrapper[4790]: E0313 20:29:27.678446 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.679551 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707134 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707161 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.707178 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810128 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810199 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.810227 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912769 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912852 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912875 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:27 crc kubenswrapper[4790]: I0313 20:29:27.912895 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:27Z","lastTransitionTime":"2026-03-13T20:29:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015782 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.015832 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.036814 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.120880 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121054 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121173 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.121282 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.206998 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:28 crc kubenswrapper[4790]: E0313 20:29:28.207307 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.223909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.223964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.223978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.224006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.224020 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326823 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.326851 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430419 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430494 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.430504 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540901 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.540911 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645170 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.645198 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.748846 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852053 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852079 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.852088 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954819 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954835 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:28 crc kubenswrapper[4790]: I0313 20:29:28.954848 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:28Z","lastTransitionTime":"2026-03-13T20:29:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058367 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058395 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058413 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.058425 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161685 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.161695 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.263709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264435 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.264619 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.367272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.367790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.368034 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.368232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.368530 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471301 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471329 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.471340 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.496827 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.496981 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.496954199 +0000 UTC m=+116.518070120 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497197 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497260 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.497244757 +0000 UTC m=+116.518360678 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497198 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497450 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497546 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.497526375 +0000 UTC m=+116.518642336 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.497643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497778 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497872 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497974 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498129 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.498118712 +0000 UTC m=+116.519234603 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.497898 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498310 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498424 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.498561 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.498530983 +0000 UTC m=+116.519646974 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573477 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.573724 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.598335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.598528 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.598604 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:29:45.598586154 +0000 UTC m=+116.619702065 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659371 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659456 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659469 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.659491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.660118 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.659946 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.659781 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:29 crc kubenswrapper[4790]: E0313 20:29:29.660185 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676448 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676879 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.676949 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.690084 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.701931 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.711889 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.720936 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.736357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner 
reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.752495 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.765137 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779688 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.779734 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.783282 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4
aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\"
,\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\
\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.799482 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 
crc kubenswrapper[4790]: I0313 20:29:29.813315 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.837001 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d0887928b430f1d66be99cbe2ad22893fc680bd99931351299b685220f447840\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:23Z\\\",\\\"message\\\":\\\"0:29:23.322508 6693 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:29:23.322517 6693 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:29:23.322524 6693 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:29:23.322595 6693 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:29:23.322648 6693 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0313 20:29:23.322660 6693 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0313 20:29:23.322691 6693 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0313 20:29:23.322737 6693 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0313 20:29:23.322747 6693 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:29:23.322735 6693 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0313 20:29:23.322765 6693 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0313 20:29:23.322770 6693 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0313 20:29:23.322751 6693 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0313 20:29:23.322797 6693 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:29:23.323000 6693 factory.go:656] Stopping watch factory\\\\nI0313 20:29:23.323035 6693 ovnkube.go:599] Stopped ovnkube\\\\nI03\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate 
Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID
\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.850842 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.863101 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.871998 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881839 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881850 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881865 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.881876 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.883728 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985335 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:29 crc kubenswrapper[4790]: I0313 20:29:29.985417 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:29Z","lastTransitionTime":"2026-03-13T20:29:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088202 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.088246 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190282 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190352 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.190492 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.292807 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293204 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293478 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.293924 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.396884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397219 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.397810 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501372 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501467 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.501530 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605141 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.605866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.606006 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.634748 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635482 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.635649 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.654887 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.660930 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.660997 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.661011 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.661026 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.661037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.679707 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685107 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685324 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685513 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.685854 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.702326 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707295 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707327 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.707338 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.722945 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727471 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727571 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.727633 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.743316 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:30Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:30 crc kubenswrapper[4790]: E0313 20:29:30.743505 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745697 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745755 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.745767 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849527 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849576 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849587 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.849616 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953186 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953272 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953317 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:30 crc kubenswrapper[4790]: I0313 20:29:30.953328 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:30Z","lastTransitionTime":"2026-03-13T20:29:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057149 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.057289 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161736 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.161811 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273578 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.273768 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.376877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.376966 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.376991 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.377019 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.377037 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480589 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480618 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.480638 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583479 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.583676 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660021 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660106 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660195 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660231 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.660502 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660633 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660710 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:31 crc kubenswrapper[4790]: E0313 20:29:31.660910 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687630 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687676 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687706 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.687719 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790336 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.790549 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.893983 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894047 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.894069 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997339 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997361 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:31 crc kubenswrapper[4790]: I0313 20:29:31.997405 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:31Z","lastTransitionTime":"2026-03-13T20:29:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.099987 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100036 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.100071 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202242 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.202252 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304808 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304828 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.304841 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.407825 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408061 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408095 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.408118 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510881 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510909 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.510920 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.613888 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716925 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.716978 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.717000 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820820 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820831 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820848 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.820863 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923895 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923965 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.923990 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:32 crc kubenswrapper[4790]: I0313 20:29:32.924004 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:32Z","lastTransitionTime":"2026-03-13T20:29:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.026941 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027015 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.027098 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.129964 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130020 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130046 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.130056 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.232939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.233067 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338714 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.338742 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441133 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441144 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441159 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.441169 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544958 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544973 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.544989 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648238 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648323 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648444 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.648469 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.658930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.658995 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.658938 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659312 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659474 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.659718 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:33 crc kubenswrapper[4790]: E0313 20:29:33.659983 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751498 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751556 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751575 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.751588 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854622 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.854660 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957475 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957489 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:33 crc kubenswrapper[4790]: I0313 20:29:33.957500 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:33Z","lastTransitionTime":"2026-03-13T20:29:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059557 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059566 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059581 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.059593 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162131 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162145 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.162154 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265138 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.265176 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367390 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367407 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.367417 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470647 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.470659 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573743 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.573774 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676100 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676139 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.676155 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778926 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778949 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.778966 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881267 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881288 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881314 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.881334 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983886 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983910 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:34 crc kubenswrapper[4790]: I0313 20:29:34.983919 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:34Z","lastTransitionTime":"2026-03-13T20:29:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086565 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086583 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086605 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.086625 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189607 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189678 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.189692 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292098 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292115 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.292127 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394215 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394264 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394279 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.394291 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497492 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497515 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.497565 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600855 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.600922 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660617 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660755 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.660894 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.660888 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.661052 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.661596 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:35 crc kubenswrapper[4790]: E0313 20:29:35.661784 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.677982 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703593 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.703675 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806157 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806205 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806231 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.806242 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910359 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910470 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:35 crc kubenswrapper[4790]: I0313 20:29:35.910512 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:35Z","lastTransitionTime":"2026-03-13T20:29:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014503 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014592 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014665 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.014730 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117466 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117539 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117558 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117584 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.117601 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.220892 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.220956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.220974 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.221003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.221023 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326040 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.326171 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429425 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429547 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.429606 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533189 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.533242 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637129 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637146 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637169 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.637188 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740905 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.740934 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.843985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844121 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.844141 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946605 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946696 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:36 crc kubenswrapper[4790]: I0313 20:29:36.946712 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:36Z","lastTransitionTime":"2026-03-13T20:29:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048814 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048921 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.048968 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152794 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.152867 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255884 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255936 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255950 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255969 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.255981 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359942 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359976 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.359990 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463286 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.463528 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.566160 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.566708 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.566902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.567221 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.567364 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659400 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659450 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.659551 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659626 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.659729 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.659361 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.659935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:37 crc kubenswrapper[4790]: E0313 20:29:37.660039 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671041 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.671201 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775051 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775427 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775480 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.775713 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878579 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.878644 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982211 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982247 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982257 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982271 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:37 crc kubenswrapper[4790]: I0313 20:29:37.982283 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:37Z","lastTransitionTime":"2026-03-13T20:29:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.084999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.085150 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188147 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.188158 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291497 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291506 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291521 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.291531 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393568 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393579 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.393606 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497025 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497084 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497101 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497124 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.497140 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.599945 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.599985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.599993 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.600007 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.600019 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.660178 4790 scope.go:117] "RemoveContainer" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.676978 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.695158 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702681 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702698 4790 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702722 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.702741 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.715914 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d7
2d2e8dbb4270a971eeb17367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.734159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.759401 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.773079 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.784978 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.797946 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.804919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805017 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805038 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.805051 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.810594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.823723 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.839439 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.853990 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.869298 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.882574 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.895828 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.906813 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908132 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908168 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908178 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.908202 4790 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:38Z","lastTransitionTime":"2026-03-13T20:29:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:38 crc kubenswrapper[4790]: I0313 20:29:38.925115 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:38Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010832 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010842 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010856 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.010866 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112676 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.112714 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215300 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215362 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215401 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.215414 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.251557 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.253769 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.254143 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.265438 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.280372 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.290338 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.299538 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.311022 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317663 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317671 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317686 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.317696 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.322916 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.334395 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.346324 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.360113 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.373743 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.392049 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.407754 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419896 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.419932 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.430015 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.447119 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.469322 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.485167 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.500646 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.522956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.522999 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.523009 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.523024 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.523035 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625785 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.625823 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659259 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659362 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659480 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.659511 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659496 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659614 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659690 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:39 crc kubenswrapper[4790]: E0313 20:29:39.659748 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.682448 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.702046 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.719175 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729399 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729468 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729481 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729503 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.729517 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.735207 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.753815 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.783421 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.801265 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.821094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831805 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831813 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.831840 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.853094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55
a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.872241 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.885417 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.898078 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.910463 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.923756 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934280 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934322 4790 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934331 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934344 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.934353 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:39Z","lastTransitionTime":"2026-03-13T20:29:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.941625 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.958165 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:39 crc kubenswrapper[4790]: I0313 20:29:39.977851 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:39Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037023 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037087 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.037098 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140819    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140901    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140910    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140925    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.140935    4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244276    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244435    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244461    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244491    4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.244515    4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.260539    4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.261455    4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/1.log"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.265154    4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" exitCode=1
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.265227    4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060"}
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.265299    4790 scope.go:117] "RemoveContainer" containerID="e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.266489    4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060"
Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.266844    4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5"
Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.281911    4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.298852 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.312466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.323867 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.337770 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346914 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346929 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.346939 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.351273 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.365635 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.375390 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.391029 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.402599 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.418205 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.432248 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449829 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449846 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449868 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.449884 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.455163 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189f
e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.468971 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.481069 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.499141 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e61913d5fbface0ec42012b915b03c669d97b4d72d2e8dbb4270a971eeb17367\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:24Z\\\",\\\"message\\\":\\\"mn _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:c94130be-172c-477c-88c4-40cc7eba30fe}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {eb8eef51-1a8d-43f9-ae2e-3b2cc00ded60}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:73135118-cf1b-4568-bd31-2f50308bf69d}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:24.978253 6832 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:29:24.978290 6832 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:29:24.978371 6832 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:24Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2
099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.511988 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"contain
erID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552657 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552747 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.552757 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655754 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655822 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655838 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655854 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.655865 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758956 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.758972 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786431 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786441 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.786468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.804307 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808216 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808254 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808265 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808281 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.808293 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.822838 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826902 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826922 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826940 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.826951 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.838554 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842136 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842181 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842195 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.842207 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.854976 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858582 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858611 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858619 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858632 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.858640 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.872422 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:40Z is after 
2025-08-24T17:21:41Z" Mar 13 20:29:40 crc kubenswrapper[4790]: E0313 20:29:40.872561 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874097 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874166 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.874212 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976368 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:40 crc kubenswrapper[4790]: I0313 20:29:40.976446 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:40Z","lastTransitionTime":"2026-03-13T20:29:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078636 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.078757 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181851 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181916 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.181954 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.271117 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.277276 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.277811 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286183 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.286704 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.304933 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.322589 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.350941 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.364300 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.376412 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389601 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389727 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389802 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.389829 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.393222 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.408637 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.422668 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.437600 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.451321 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.470402 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.485738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492187 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492232 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492243 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492259 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.492269 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.501347 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.516724 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.527861 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.548481 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.567051 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:41Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594431 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594504 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594532 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.594550 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.658848 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.658974 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.659256 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.659739 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.659769 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.659971 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.660468 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:41 crc kubenswrapper[4790]: E0313 20:29:41.661629 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697256 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697334 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.697410 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800649 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800777 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.800795 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904648 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904719 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904736 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:41 crc kubenswrapper[4790]: I0313 20:29:41.904778 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:41Z","lastTransitionTime":"2026-03-13T20:29:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006836 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006890 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.006918 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.109590 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.109900 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.110246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.110585 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.110739 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213710 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213897 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.213975 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316533 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316596 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316614 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316637 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.316653 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419276 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419651 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419753 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.419840 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522474 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522516 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522528 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522542 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.522553 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625326 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625407 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625429 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625453 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.625469 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.671583 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728840 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728857 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728878 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.728897 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832209 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832269 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832293 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.832310 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935672 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935773 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935792 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935853 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:42 crc kubenswrapper[4790]: I0313 20:29:42.935875 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:42Z","lastTransitionTime":"2026-03-13T20:29:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038640 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038711 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038723 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.038748 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141766 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141798 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141824 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.141833 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244363 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244486 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244515 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.244538 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347031 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347077 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.347101 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450585 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450626 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450638 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450654 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.450668 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552796 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552815 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552838 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.552855 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655501 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655511 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655526 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.655537 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659801 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659867 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659905 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.659833 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660023 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660241 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660651 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:43 crc kubenswrapper[4790]: E0313 20:29:43.660739 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.661570 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758194 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758255 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.758266 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861064 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861076 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861091 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.861102 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963826 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963870 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963885 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963906 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:43 crc kubenswrapper[4790]: I0313 20:29:43.963921 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:43Z","lastTransitionTime":"2026-03-13T20:29:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.067468 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170510 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170540 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.170551 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272830 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272864 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272877 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272893 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.272904 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.287749 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.289504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.289979 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.301511 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: 
I0313 20:29:44.318243 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\
\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.331522 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.345401 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.359272 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.374982 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375063 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375075 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375093 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.375110 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.374960 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running
\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.400615 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedA
t\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9e
ed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.415281 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.427877 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.446748 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.461034 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.473583 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477338 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477404 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477434 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.477444 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.484700 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.495549 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 
20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.506180 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.516289 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.527135 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.539959 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"syste
m-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:44Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580764 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580793 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580803 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580816 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.580825 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.670573 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682704 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682739 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682751 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.682759 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785122 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785135 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.785160 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888190 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888233 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888245 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888261 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.888274 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991055 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991066 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:44 crc kubenswrapper[4790]: I0313 20:29:44.991092 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:44Z","lastTransitionTime":"2026-03-13T20:29:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093423 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093436 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.093459 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.198911 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199213 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199309 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.199337 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302152 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302275 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.302295 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405674 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405737 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.405828 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.509666 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.509738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.509756 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.510182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.510239 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.572782 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.572927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.572990 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573003 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.572968419 +0000 UTC m=+148.594084310 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573040 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.573031541 +0000 UTC m=+148.594147542 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.573095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.573142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.573179 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573206 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573252 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.573236307 +0000 UTC m=+148.594352268 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573306 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573321 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573332 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573364 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.57335731 +0000 UTC m=+148.594473301 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573478 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573513 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573533 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.573609 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.573590188 +0000 UTC m=+148.594706109 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612319 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612350 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612357 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612370 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.612402 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659604 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.659735 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659835 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659889 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.659847 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.659994 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.660052 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.660102 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.673838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.674033 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: E0313 20:29:45.674242 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:30:17.674226624 +0000 UTC m=+148.695342515 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715318 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715410 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715496 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.715570 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818203 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818250 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818266 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.818276 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920873 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920887 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:45 crc kubenswrapper[4790]: I0313 20:29:45.920897 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:45Z","lastTransitionTime":"2026-03-13T20:29:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023006 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023039 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023062 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.023072 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125597 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125624 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125631 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125643 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.125652 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228111 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228167 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228180 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228193 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.228203 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331018 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331177 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331197 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331226 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.331245 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434472 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434600 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434662 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.434685 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536913 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536955 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536971 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536985 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.536995 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639328 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639418 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639433 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639452 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.639464 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741800 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741857 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741867 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741882 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.741891 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845287 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845355 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845369 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845417 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.845434 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948010 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948059 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948083 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:46 crc kubenswrapper[4790]: I0313 20:29:46.948098 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:46Z","lastTransitionTime":"2026-03-13T20:29:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050801 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050844 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050866 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.050890 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154050 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154060 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154074 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.154084 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257679 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257716 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257726 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257741 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.257755 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360154 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360164 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360182 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.360194 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463594 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463618 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463646 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.463673 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566048 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566090 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566099 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566113 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.566124 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659484 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659503 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.659632 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659507 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.659714 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.659808 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.659937 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:47 crc kubenswrapper[4790]: E0313 20:29:47.660010 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667871 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667879 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667891 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.667900 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770687 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770698 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770715 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.770725 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873703 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873809 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.873829 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975872 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975918 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975928 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975952 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:47 crc kubenswrapper[4790]: I0313 20:29:47.975964 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:47Z","lastTransitionTime":"2026-03-13T20:29:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078732 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078784 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078795 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078812 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.078823 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181535 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181652 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181683 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.181703 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283667 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283705 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283718 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283735 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.283748 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386541 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386555 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.386563 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.489789 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.489919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.489972 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.490000 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.490016 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592707 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592767 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592776 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592790 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.592989 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696321 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696360 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696371 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696403 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.696415 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799096 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799454 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799584 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799709 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.799815 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902554 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902617 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902627 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902642 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:48 crc kubenswrapper[4790]: I0313 20:29:48.902653 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:48Z","lastTransitionTime":"2026-03-13T20:29:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004351 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004430 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004456 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.004466 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106402 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106450 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106460 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106476 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.106488 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209081 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209117 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209126 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209150 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.209161 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.310827 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311071 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311188 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.311241 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413499 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413508 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413522 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.413534 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516277 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516599 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516693 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516780 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.516870 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:49Z","lastTransitionTime":"2026-03-13T20:29:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.617147 4790 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658795 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658796 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658861 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.658870 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659412 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659593 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659687 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.659778 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.673479 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.687425 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.699358 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.713706 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.728738 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: E0313 20:29:49.740700 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.745544 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.760985 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.773547 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.784967 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.801585 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.821079 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.832452 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.843000 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.858358 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.868558 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.879029 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.889477 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.900987 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:49 crc kubenswrapper[4790]: I0313 20:29:49.912470 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:49Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.234939 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.234980 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.234988 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.235003 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.235012 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.249537 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254184 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254263 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254283 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.254348 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.275152 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278846 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278908 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278924 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278944 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.278956 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.292353 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.295943 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296022 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296042 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296110 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.296131 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.313011 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316604 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316661 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316700 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316738 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.316763 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:29:51Z","lastTransitionTime":"2026-03-13T20:29:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.331633 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:51Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:51Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.331750 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.659753 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.659841 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.659909 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660063 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:51 crc kubenswrapper[4790]: I0313 20:29:51.660314 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660423 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660613 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:51 crc kubenswrapper[4790]: E0313 20:29:51.660741 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659274 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659316 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659422 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:53 crc kubenswrapper[4790]: I0313 20:29:53.659502 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659495 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659689 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659826 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:53 crc kubenswrapper[4790]: E0313 20:29:53.659990 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:54 crc kubenswrapper[4790]: E0313 20:29:54.742421 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.659496 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.659730 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.660062 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.660117 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660150 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660211 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660337 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.660548 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:55 crc kubenswrapper[4790]: I0313 20:29:55.660835 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:29:55 crc kubenswrapper[4790]: E0313 20:29:55.661098 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659250 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659286 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659250 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659423 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659455 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659518 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:57 crc kubenswrapper[4790]: I0313 20:29:57.659563 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:57 crc kubenswrapper[4790]: E0313 20:29:57.659877 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659248 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.659398 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659494 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659548 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.659900 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.659886 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.659993 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.660063 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.681049 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.695887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.712652 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\
"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.730296 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: E0313 20:29:59.742928 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.745995 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.757357 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.769236 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.781813 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.793238 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.823672 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.836262 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.847170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.865205 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.877825 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.891735 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.904631 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.920042 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.933775 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:29:59 crc kubenswrapper[4790]: I0313 20:29:59.946280 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:29:59Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344201 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/0.log" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344252 4790 generic.go:334] "Generic (PLEG): container finished" podID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" containerID="fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139" exitCode=1 Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerDied","Data":"fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139"} Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.344620 4790 scope.go:117] "RemoveContainer" containerID="fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.359966 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.378340 4790 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.402396 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.423301 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.445292 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.456267 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.466154 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.477144 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 
20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.487597 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.498895 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.511790 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.526154 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon 
started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.540011 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.552274 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 
10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51
c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.565056 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.578671 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.594454 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.607115 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.622466 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.661873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.662205 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.662650 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662348 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.662244 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662759 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662887 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.662930 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675779 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675834 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675847 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675863 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.675875 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.687979 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.691797 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.691919 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.692002 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.692094 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.692167 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.705590 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708635 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708749 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708843 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708933 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.708996 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.719702 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722298 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722440 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722544 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722621 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.722699 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.734307 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738172 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738273 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738358 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738464 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:01 crc kubenswrapper[4790]: I0313 20:30:01.738540 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:01Z","lastTransitionTime":"2026-03-13T20:30:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.750024 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:01Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:01 crc kubenswrapper[4790]: E0313 20:30:01.750415 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.349166 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/0.log" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.349234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e"} Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.363663 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/open
shift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.380602 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.397246 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.410657 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.422548 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.433574 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.462620 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.476723 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.488442 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.507013 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.523006 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.534771 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.543225 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.552846 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.563094 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.572965 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.583805 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.595733 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.610279 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.804895 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.816152 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.837563 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.852737 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.872887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.887861 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.901550 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.915332 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 
20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.926188 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.940734 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.951794 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.964164 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.975159 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:02 crc kubenswrapper[4790]: I0313 20:30:02.988457 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.001114 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:02Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.019538 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.054673 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.067998 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.082554 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.098204 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:03Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.659930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.660022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660087 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660177 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.660271 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660604 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:03 crc kubenswrapper[4790]: I0313 20:30:03.660929 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:03 crc kubenswrapper[4790]: E0313 20:30:03.660997 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:04 crc kubenswrapper[4790]: E0313 20:30:04.744216 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659272 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.659887 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659348 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.659983 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659459 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.660058 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:05 crc kubenswrapper[4790]: I0313 20:30:05.659341 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:05 crc kubenswrapper[4790]: E0313 20:30:05.660125 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659753 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659868 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659774 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.659924 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:07 crc kubenswrapper[4790]: I0313 20:30:07.659868 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.660093 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.660173 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:07 crc kubenswrapper[4790]: E0313 20:30:07.660354 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:08 crc kubenswrapper[4790]: I0313 20:30:08.659735 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.370323 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.373109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.373695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.389549 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.403038 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.416542 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.430815 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.444709 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.458832 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.481118 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.501129 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.517528 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.537038 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.551682 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.564698 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.577594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.595345 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.604646 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.614207 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.628499 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.639709 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.652781 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659064 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659101 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659221 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659239 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659308 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.659352 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659427 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.659477 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.680519 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae
245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.694495 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.705786 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.723684 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"co
ntainerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: E0313 20:30:09.744721 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.747270 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.761626 4790 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\
\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.771068 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.780097 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod 
\"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.789819 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.798245 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.808523 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.821594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.833303 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.848495 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.863602 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.877258 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.893594 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.907479 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:09 crc kubenswrapper[4790]: I0313 20:30:09.917582 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:09Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.379067 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.379769 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/2.log" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.382518 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" exitCode=1 Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.382602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" 
event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.382692 4790 scope.go:117] "RemoveContainer" containerID="921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.383956 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:10 crc kubenswrapper[4790]: E0313 20:30:10.384359 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.397315 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":
\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.411055 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.423887 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.437336 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 
20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.451151 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.467154 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.481199 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.495284 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] 
Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.512926 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.527692 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.545370 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.558656 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.568612 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.583168 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.605128 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.620231 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.634592 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.652179 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://921527a6f6de7d69504130c91a6a14db2b0fce55a0fc944e5b7457a38ada3060\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:29:39Z\\\",\\\"message\\\":\\\"ble:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-dns/dns-default]} name:Service_openshift-dns/dns-default_UDP_node_router+switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[udp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.10:53:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4c1be812-05d3-4f45-91b5-a853a5c8de71}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0313 20:29:39.495482 7016 services_controller.go:451] Built service openshift-operator-lifecycle-manager/packageserver-service cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-operator-lifecycle-manager/packageserver-service_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-operator-lifecycle-manager/packageserver-service\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.153\\\\\\\", Port:5443, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, 
Rou\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:10 crc kubenswrapper[4790]: I0313 20:30:10.669170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:10Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.391730 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.398129 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.398459 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.416729 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.433713 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.451305 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.465144 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.478170 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.490546 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.515180 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a
06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.529586 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.542368 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.563207 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.577167 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.587164 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a93800
66b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.597589 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.607593 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.618558 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 
20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.630557 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.642016 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.654471 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659164 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659245 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659265 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659325 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659369 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.659334 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659502 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.659567 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.669541 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757374 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757445 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757457 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757473 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.757485 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.773637 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777628 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777659 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777668 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777684 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.777693 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.794903 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799013 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799072 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799089 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799106 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.799120 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.812152 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815669 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815712 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815760 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.815803 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.834138 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838239 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838316 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838330 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838353 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:11 crc kubenswrapper[4790]: I0313 20:30:11.838367 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:11Z","lastTransitionTime":"2026-03-13T20:30:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.851527 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:11Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:11 crc kubenswrapper[4790]: E0313 20:30:11.851707 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659503 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659572 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.659769 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:13 crc kubenswrapper[4790]: I0313 20:30:13.659788 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.659956 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.660072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:13 crc kubenswrapper[4790]: E0313 20:30:13.660180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:14 crc kubenswrapper[4790]: E0313 20:30:14.746437 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.658958 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.659591 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.659792 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.659872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:15 crc kubenswrapper[4790]: I0313 20:30:15.659806 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.660028 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.660092 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:15 crc kubenswrapper[4790]: E0313 20:30:15.660551 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.605644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.605884 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.605856817 +0000 UTC m=+212.626972708 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606331 4790 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.606345 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606395 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.606363442 +0000 UTC m=+212.627479333 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606457 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606534 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606545 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606567 4790 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606586 4790 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606581 4790 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606648 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.606628229 +0000 UTC m=+212.627744190 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606699 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.60666882 +0000 UTC m=+212.627784811 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606586 4790 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.606783 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.606761753 +0000 UTC m=+212.627877784 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.659861 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.659878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.660111 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660240 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.660266 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660507 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660602 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.660680 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:17 crc kubenswrapper[4790]: I0313 20:30:17.707462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.707753 4790 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:30:17 crc kubenswrapper[4790]: E0313 20:30:17.707850 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs podName:c54336a0-5a12-4bf9-9807-337dd352fdb6 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.707829163 +0000 UTC m=+212.728945054 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs") pod "network-metrics-daemon-mnf26" (UID: "c54336a0-5a12-4bf9-9807-337dd352fdb6") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659271 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659339 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659434 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659527 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659546 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.659582 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659658 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.659739 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.679904 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a4
8ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.693216 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.705059 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.724328 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.739054 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: E0313 20:30:19.746988 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.749843 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.760957 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.772555 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.784391 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.793493 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.803769 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.815736 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with 
unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.830148 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 
[cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.843210 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.859286 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.870486 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.880285 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.891653 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:19 crc kubenswrapper[4790]: I0313 20:30:19.901793 4790 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:19Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659792 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659874 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659917 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:21 crc kubenswrapper[4790]: I0313 20:30:21.659827 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.659957 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.660242 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.660363 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:21 crc kubenswrapper[4790]: E0313 20:30:21.660443 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243720 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243750 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243759 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243772 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.243780 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.257405 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261208 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261246 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261258 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.261267 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.278314 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282315 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282354 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282365 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282397 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.282407 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.294714 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298465 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298518 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298531 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298551 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.298563 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.314590 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318740 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318778 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318786 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318804 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:22 crc kubenswrapper[4790]: I0313 20:30:22.318845 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:22Z","lastTransitionTime":"2026-03-13T20:30:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.330536 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:22Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:22Z is after 
2025-08-24T17:21:41Z" Mar 13 20:30:22 crc kubenswrapper[4790]: E0313 20:30:22.330651 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659251 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659116 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659306 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:23 crc kubenswrapper[4790]: I0313 20:30:23.659320 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659399 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659456 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:23 crc kubenswrapper[4790]: E0313 20:30:23.659652 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:24 crc kubenswrapper[4790]: I0313 20:30:24.660333 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:24 crc kubenswrapper[4790]: E0313 20:30:24.660828 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:24 crc kubenswrapper[4790]: E0313 20:30:24.748453 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659125 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659193 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659207 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:25 crc kubenswrapper[4790]: I0313 20:30:25.659278 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659284 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659402 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659563 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:25 crc kubenswrapper[4790]: E0313 20:30:25.659715 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659316 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659536 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659549 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659604 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659673 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:27 crc kubenswrapper[4790]: I0313 20:30:27.659488 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:27 crc kubenswrapper[4790]: E0313 20:30:27.659779 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659152 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659285 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659285 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659506 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.659594 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659708 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659815 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.659888 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.675448 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-x2tjg" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:01Z\\\",\\\"message\\\":\\\"2026-03-13T20:29:15+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43\\\\n2026-03-13T20:29:15+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ceb2e840-b163-40af-ad91-61ca57c1ca43 to /host/opt/cni/bin/\\\\n2026-03-13T20:29:16Z [verbose] multus-daemon started\\\\n2026-03-13T20:29:16Z [verbose] Readiness Indicator file check\\\\n2026-03-13T20:30:01Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:30:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-pmkvj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-x2tjg\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.689130 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.702074 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.716157 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.730782 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3c0fd02689d39599997373d58d14a623c083cc933ba9d6effbeba9a722c33159\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.742500 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58464a30-7f56-4e13-894e-e53498a85637\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0e796c199087aedc4fceb772e39310fcbec6349316b04d40b15c80c5e349717\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h2vjb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-drtsx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: E0313 20:30:29.748988 4790 kubelet.go:2916] "Container runtime 
network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.756489 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-9tpww" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"05405fad-1758-412e-b3ab-9714a604b207\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2aae2b8ef737279e556fa66ebfd571d822b225e4cdc266d27bb090cbd3901f87\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7thmq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-9tpww\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.772229 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4da2be5-d947-41bd-b381-0b9eae10293d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:52Z\\\",\\\"message\\\":\\\"le observer\\\\nW0313 20:28:52.165524 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0313 20:28:52.165654 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0313 20:28:52.166349 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1914040246/tls.crt::/tmp/serving-cert-1914040246/tls.key\\\\\\\"\\\\nI0313 20:28:52.395548 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0313 20:28:52.397238 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0313 20:28:52.397262 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0313 20:28:52.397283 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0313 20:28:52.397295 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0313 20:28:52.403147 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0313 20:28:52.403211 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0313 20:28:52.403206 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0313 20:28:52.403222 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0313 20:28:52.403235 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0313 20:28:52.403243 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0313 20:28:52.403249 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0313 20:28:52.403272 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0313 20:28:52.403571 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:28:51Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.787209 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"457c31c8-9473-4e0b-b381-08c8223f5299\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dc949828fa60f8feba0f66c8d0cb607645a5aafd38b414d0649dd99f91a3b34\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5e8a51cd9797e3dbedf8e06ca42611deb089db49db8160de2ad63dee9ae95b84\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-13T20:28:16Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0313 20:27:51.774259 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0313 20:27:51.776118 1 observer_polling.go:159] Starting file observer\\\\nI0313 20:27:51.802542 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0313 20:27:51.806119 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0313 20:28:16.371103 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0313 20:28:16.371210 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:28:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3398c67b6041fb99eb8836ca662c339eb0ae03120568770b6f1ad094f61c3fe5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a24f168ec02dd5b574231b4f6400627d24009d80495fe51c56c3679d67a3f0b0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.801350 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://77505fb6deb478ffc3cf5c2fc0e2dd210ecb5f52a92527f72e0d74ead318e42f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fa996f28657eb013bec402012bcdb402eae90c08437b39068939000372c9f13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/o
vnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.828858 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede73
94325bf958a57e9f330f464f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-13T20:30:09Z\\\",\\\"message\\\":\\\".go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0313 20:30:09.493863 7376 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0313 20:30:09.493873 7376 handler.go:208] Removed *v1.Node event handler 2\\\\nI0313 20:30:09.493894 7376 handler.go:208] Removed *v1.Node event handler 7\\\\nI0313 20:30:09.493947 7376 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494023 7376 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0313 20:30:09.494069 7376 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0313 20:30:09.494075 7376 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0313 20:30:09.494107 7376 factory.go:656] Stopping watch factory\\\\nI0313 20:30:09.494117 7376 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0313 20:30:09.494131 7376 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0313 20:30:09.494152 7376 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0313 20:30:09.494509 7376 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0313 20:30:09.494643 7376 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0313 20:30:09.494714 7376 ovnkube.go:599] Stopped ovnkube\\\\nI0313 20:30:09.494790 7376 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0313 20:30:09.494922 7376 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-13T20:30:08Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h24bv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-gz4fj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.850171 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96d699b6-dfba-4b76-b3e8-0480527aa386\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://312f78349c58c8eaedb9c37e40ebdbee359e5e9422bee44b79ace1580d6b58b5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cec6b03701b33c20f7758a79ec7a4aaa9a230d285de86168d6c13f100097be09\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38d76a68ec1b1ea13e6a635aeb68f47a9d72c4009ebd5f58ccb32b3783a90696\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5703c4c0ddfa5a72777b57524e4127fc2994b9a401d8533b931ddb9942c19375\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-13T20:29:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5480ebb8c812b77539cc243bfea6f42ac69964dd066e3843abc1cccb44303460\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28ba563b9bcdf84d09136e26e8bd0bd578c1ee26bfbf0a70354258f370f3a283\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2badb3bba896e48a9976cc6722a18094e472d84d77df1be4f868347b760062b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:29:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:29:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h76wc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wq8kp\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.872135 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"048f95b7-a7de-4096-90a7-c9a0e2c68f18\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://120b7f74f49c3cb2b06ad82a2c040e69fa133f222638968b5a12bd56c6e491c4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://677e5550a48ab3a9d71e6a78525a7f98181cb83ce8624a9fced9c630cf0366f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673
14731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bd968df9ea34ca509bfd1295f67115ab24d9336488562a771a94d005f85cdc21\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c549ca4fb0ba0660d98bec8f1ea7f45b93f122a06803411fabc64936c4d7e60a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d27e15e3c4cd1b46a591cde0fbf16c5dad0be0cd988f9aa47b22b1a38df785da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4e179aed20f29962ae9870d4d15377d9d147145f21cd9eed6f018432d4189fe1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ef8342074fe86f2ae245d4f14d80fca98afe6ea973f3998e8530b8ef636fbe8d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0e654bc06fc71e5bb56e4b2b8605aa6d000bd7f25601ca71019ce8824fb12364\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.886016 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f01fe9fa37be715ece0e35f5c2666a2317bd851f9df9a8fb32552a1550702e80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.895091 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-x4d2p" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"58c65c62-097b-4179-9ada-1627afa9fef2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e91cf021c86596f4775f74624b2c1e7f82013770916e1aa775393dc6a2e6591e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w6x7z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-x4d2p\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.906497 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2f8e0711-7595-4580-b702-558512c33395\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cff34bf57b328f17c8fee501fafd7ba88abcb6a8429f34480e42296db22a2b36\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c69d69402b6131e88e8110261d8c5eba9413e3850f30194fd29d068fd96669ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:29:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fq7qg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-lgs75\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 
20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.916808 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mnf26" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c54336a0-5a12-4bf9-9807-337dd352fdb6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:29:13Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7g48h\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:29:13Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mnf26\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.927415 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fee35d2c-dae5-419f-880c-c4a9920b5003\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:28:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d6781d1226a8dda62e82876a63a48b134482565484786206cd0104f4d49938\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5683f9ef291f82723019a5396e713ba263fca23a50d919c9243fbf1f61329312\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed062fb6bf175910ae6a81d81ba62f653a9719a53eecf0ddd8d14804babe5fe6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72aa968fcc5568f618b0ddb596fa0473f347d51eb164ae6b209e483f0cc633f5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:29 crc kubenswrapper[4790]: I0313 20:30:29.936663 4790 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a017383-dba5-4131-b6d9-c4a583290c79\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-13T20:27:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e951b86b6fae6bdf73b22ad634ca8f18e590e1c759b35f95a53ed5f09faece98\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T20:27:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://da9ac6f78445eca696ca81ccc4d2384ba350f2ce01fd0095de9f560ecb976cf7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-13T20:27:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-13T20:27:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-13T20:27:49Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:29Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.659873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.659932 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.659894 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660023 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:31 crc kubenswrapper[4790]: I0313 20:30:31.660212 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660215 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660273 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:31 crc kubenswrapper[4790]: E0313 20:30:31.660328 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498192 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498229 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498237 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498251 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.498261 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.510912 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516130 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516196 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516220 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516252 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.516274 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.531321 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536253 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536416 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536459 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536491 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.536508 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.556143 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560004 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560057 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560068 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560085 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.560096 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.576757 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580462 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580633 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580731 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580817 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:32 crc kubenswrapper[4790]: I0313 20:30:32.580937 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:32Z","lastTransitionTime":"2026-03-13T20:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.595295 4790 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404548Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865348Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-13T20:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"ddb77a45-6df3-4ccf-8361-682222076454\\\",\\\"systemUUID\\\":\\\"e656ddb5-8fa2-4c70-bd3f-f718d29b7550\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-13T20:30:32Z is after 2025-08-24T17:21:41Z" Mar 13 20:30:32 crc kubenswrapper[4790]: E0313 20:30:32.595596 4790 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659621 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659840 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659882 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:33 crc kubenswrapper[4790]: I0313 20:30:33.659930 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.660569 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.660593 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.660821 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:33 crc kubenswrapper[4790]: E0313 20:30:33.661000 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:34 crc kubenswrapper[4790]: E0313 20:30:34.750824 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659276 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659498 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659520 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:35 crc kubenswrapper[4790]: I0313 20:30:35.659600 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659681 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659783 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659835 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:35 crc kubenswrapper[4790]: E0313 20:30:35.659905 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.659127 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.659349 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.659425 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.659460 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.660354 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.660630 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-gz4fj_openshift-ovn-kubernetes(a0c9dff4-5508-4391-bb03-6710c2b9f3b5)\"" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" Mar 13 20:30:37 crc kubenswrapper[4790]: I0313 20:30:37.660892 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.661040 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.661185 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:37 crc kubenswrapper[4790]: E0313 20:30:37.661294 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659692 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659737 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659876 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.659893 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.659916 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.660261 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.660422 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.660520 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.696706 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-x2tjg" podStartSLOduration=124.696679534 podStartE2EDuration="2m4.696679534s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.695122069 +0000 UTC m=+170.716237970" watchObservedRunningTime="2026-03-13 20:30:39.696679534 +0000 UTC m=+170.717795465" Mar 13 20:30:39 crc kubenswrapper[4790]: E0313 20:30:39.751609 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.787725 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podStartSLOduration=125.787700736 podStartE2EDuration="2m5.787700736s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.777844575 +0000 UTC m=+170.798960476" watchObservedRunningTime="2026-03-13 20:30:39.787700736 +0000 UTC m=+170.808816627" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.787972 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-9tpww" podStartSLOduration=125.787966584 podStartE2EDuration="2m5.787966584s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.78783234 +0000 UTC m=+170.808948231" watchObservedRunningTime="2026-03-13 20:30:39.787966584 +0000 UTC m=+170.809082475" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.801564 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=72.801542002 podStartE2EDuration="1m12.801542002s" podCreationTimestamp="2026-03-13 20:29:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.801212453 +0000 UTC m=+170.822328344" watchObservedRunningTime="2026-03-13 20:30:39.801542002 +0000 UTC m=+170.822657903" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.813551 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=57.813534205 podStartE2EDuration="57.813534205s" podCreationTimestamp="2026-03-13 20:29:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.813226996 +0000 UTC m=+170.834342887" watchObservedRunningTime="2026-03-13 20:30:39.813534205 +0000 UTC m=+170.834650096" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.918936 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wq8kp" podStartSLOduration=124.918918468 podStartE2EDuration="2m4.918918468s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.890534347 +0000 UTC m=+170.911650238" watchObservedRunningTime="2026-03-13 20:30:39.918918468 +0000 UTC m=+170.940034359" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.935546 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=64.935527863 podStartE2EDuration="1m4.935527863s" podCreationTimestamp="2026-03-13 20:29:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.921024979 +0000 UTC m=+170.942140870" watchObservedRunningTime="2026-03-13 20:30:39.935527863 +0000 UTC m=+170.956643754" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.960716 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-x4d2p" podStartSLOduration=125.960694263 podStartE2EDuration="2m5.960694263s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.947761453 +0000 UTC m=+170.968877344" watchObservedRunningTime="2026-03-13 20:30:39.960694263 +0000 UTC m=+170.981810154" Mar 13 20:30:39 crc kubenswrapper[4790]: I0313 20:30:39.975476 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-lgs75" podStartSLOduration=124.975440084 podStartE2EDuration="2m4.975440084s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.962474673 +0000 UTC m=+170.983590564" watchObservedRunningTime="2026-03-13 20:30:39.975440084 +0000 UTC m=+170.996555975" Mar 13 20:30:40 crc kubenswrapper[4790]: I0313 20:30:40.003291 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=85.00326957 podStartE2EDuration="1m25.00326957s" podCreationTimestamp="2026-03-13 20:29:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:40.002527829 +0000 UTC m=+171.023643720" watchObservedRunningTime="2026-03-13 20:30:40.00326957 +0000 UTC m=+171.024385461" Mar 13 20:30:40 crc kubenswrapper[4790]: I0313 20:30:40.003801 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=56.003794115 podStartE2EDuration="56.003794115s" podCreationTimestamp="2026-03-13 20:29:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:39.990102473 +0000 UTC m=+171.011218364" watchObservedRunningTime="2026-03-13 20:30:40.003794115 +0000 UTC m=+171.024910006" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659127 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659183 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659354 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659466 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:41 crc kubenswrapper[4790]: I0313 20:30:41.659653 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659678 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659815 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:41 crc kubenswrapper[4790]: E0313 20:30:41.659935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733615 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733664 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733675 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733691 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.733703 4790 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-13T20:30:42Z","lastTransitionTime":"2026-03-13T20:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.793198 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4"] Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.794263 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.806767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.806872 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.807083 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.807139 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976021 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6203dbf-1e64-41e0-9a73-26def8967139-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976142 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6203dbf-1e64-41e0-9a73-26def8967139-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976164 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:42 crc kubenswrapper[4790]: I0313 20:30:42.976182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6203dbf-1e64-41e0-9a73-26def8967139-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6203dbf-1e64-41e0-9a73-26def8967139-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc 
kubenswrapper[4790]: I0313 20:30:43.077649 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077907 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6203dbf-1e64-41e0-9a73-26def8967139-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.077978 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.078011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6203dbf-1e64-41e0-9a73-26def8967139-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.078123 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f6203dbf-1e64-41e0-9a73-26def8967139-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.079237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f6203dbf-1e64-41e0-9a73-26def8967139-service-ca\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.084370 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6203dbf-1e64-41e0-9a73-26def8967139-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.104270 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f6203dbf-1e64-41e0-9a73-26def8967139-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-xktb4\" (UID: \"f6203dbf-1e64-41e0-9a73-26def8967139\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.121088 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.511269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" event={"ID":"f6203dbf-1e64-41e0-9a73-26def8967139","Type":"ContainerStarted","Data":"d959332980388be3c8e4e5623e9d1477ffca6e76a1a41023a6414e8e009fde45"} Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.511356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" event={"ID":"f6203dbf-1e64-41e0-9a73-26def8967139","Type":"ContainerStarted","Data":"1cfae9b5abcdcffb5916c54c396d3f19b07737986f58ab4f908d43596166f4f3"} Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.527704 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-xktb4" podStartSLOduration=129.527687154 podStartE2EDuration="2m9.527687154s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:43.525310156 +0000 UTC m=+174.546426047" watchObservedRunningTime="2026-03-13 20:30:43.527687154 +0000 UTC m=+174.548803045" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660573 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660663 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660677 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660743 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660774 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660826 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.660854 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:43 crc kubenswrapper[4790]: E0313 20:30:43.660923 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.700727 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 13 20:30:43 crc kubenswrapper[4790]: I0313 20:30:43.711250 4790 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 20:30:44 crc kubenswrapper[4790]: E0313 20:30:44.753004 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659054 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659086 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659097 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:45 crc kubenswrapper[4790]: I0313 20:30:45.659136 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659523 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659568 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659634 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:45 crc kubenswrapper[4790]: E0313 20:30:45.659689 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.524942 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525469 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/0.log" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525519 4790 generic.go:334] "Generic (PLEG): container finished" podID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" exitCode=1 Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerDied","Data":"9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e"} Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.525585 4790 scope.go:117] "RemoveContainer" containerID="fe063aad165db72c08f152d67592cb7f9aaf0b6413eb65ac47e79ee322b36139" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.526009 4790 scope.go:117] "RemoveContainer" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.526298 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-x2tjg_openshift-multus(207e7f49-094a-4e59-a8ff-9eacd8d6fe2a)\"" pod="openshift-multus/multus-x2tjg" podUID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659081 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659163 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.659208 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:47 crc kubenswrapper[4790]: I0313 20:30:47.659096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.659509 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.659950 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:47 crc kubenswrapper[4790]: E0313 20:30:47.660053 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:48 crc kubenswrapper[4790]: I0313 20:30:48.530783 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.659011 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.659069 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.659112 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660003 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:49 crc kubenswrapper[4790]: I0313 20:30:49.660038 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660223 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660309 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.660505 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:49 crc kubenswrapper[4790]: E0313 20:30:49.753409 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659208 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659172 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659320 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659492 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:51 crc kubenswrapper[4790]: I0313 20:30:51.659648 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659726 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:51 crc kubenswrapper[4790]: E0313 20:30:51.659932 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:52 crc kubenswrapper[4790]: I0313 20:30:52.660253 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.480651 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mnf26"] Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.481196 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.481413 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.550634 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.553365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerStarted","Data":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.554288 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.660567 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.660796 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.661056 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.661132 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:53 crc kubenswrapper[4790]: I0313 20:30:53.661928 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:53 crc kubenswrapper[4790]: E0313 20:30:53.661992 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:54 crc kubenswrapper[4790]: E0313 20:30:54.754466 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659638 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.659765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659778 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659829 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.659946 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.660100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:55 crc kubenswrapper[4790]: I0313 20:30:55.659695 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:55 crc kubenswrapper[4790]: E0313 20:30:55.661062 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659452 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659522 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659618 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:57 crc kubenswrapper[4790]: I0313 20:30:57.659642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659738 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659836 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:57 crc kubenswrapper[4790]: E0313 20:30:57.659893 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658814 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658833 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658865 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.658891 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660354 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660460 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660507 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.660599 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.660680 4790 scope.go:117] "RemoveContainer" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" Mar 13 20:30:59 crc kubenswrapper[4790]: I0313 20:30:59.683900 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podStartSLOduration=144.683871778 podStartE2EDuration="2m24.683871778s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:30:53.589080811 +0000 UTC m=+184.610196742" watchObservedRunningTime="2026-03-13 20:30:59.683871778 +0000 UTC m=+190.704987709" Mar 13 20:30:59 crc kubenswrapper[4790]: E0313 20:30:59.755041 4790 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 20:31:00 crc kubenswrapper[4790]: I0313 20:31:00.577171 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:31:00 crc kubenswrapper[4790]: I0313 20:31:00.577238 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221"} Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659570 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659615 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:01 crc kubenswrapper[4790]: I0313 20:31:01.659873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.659955 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.660078 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.660122 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:31:01 crc kubenswrapper[4790]: E0313 20:31:01.660178 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659326 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659399 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659442 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659489 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mnf26" podUID="c54336a0-5a12-4bf9-9807-337dd352fdb6" Mar 13 20:31:03 crc kubenswrapper[4790]: I0313 20:31:03.659505 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659673 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659815 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 13 20:31:03 crc kubenswrapper[4790]: E0313 20:31:03.659892 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659900 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659934 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.659991 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.663879 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.664041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.664115 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.664611 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.665034 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 20:31:05 crc kubenswrapper[4790]: I0313 20:31:05.665356 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.546765 4790 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.586724 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x7zgr"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.587319 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.587487 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.587692 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.588312 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.588956 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.589353 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.589711 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.590360 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.590802 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.591466 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.596477 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.596667 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.603084 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmlmp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.603834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.606624 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.606872 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.607185 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.607657 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.608594 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.608628 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609442 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609635 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609884 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.609981 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.613738 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.613957 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614085 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614223 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614240 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614759 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614300 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614370 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614464 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614525 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.614574 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615225 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615329 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615365 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615438 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615514 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615439 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615522 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615732 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615762 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615796 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615918 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.615991 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.616005 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.617862 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618074 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618243 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618421 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618644 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.618825 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.619848 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.620173 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.620312 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.620483 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.622535 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.622861 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623025 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623178 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623321 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623531 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.623678 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.626848 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.627367 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.627927 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zws8z"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.628291 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.630052 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jfdgz"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.630708 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.632313 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.632417 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.632314 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-zfhhl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.633740 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.634589 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.634962 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.635809 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.636156 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.636514 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.636841 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.640814 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.641656 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.649588 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.650130 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688276 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk6nf\" (UniqueName: \"kubernetes.io/projected/c5db072c-5e1d-4149-99c8-aee1209189ba-kube-api-access-nk6nf\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688299 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-encryption-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688343 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d847db-0b8e-4128-af43-a17fe76b77d9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688366 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688403 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: 
\"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688504 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-serving-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688582 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") 
pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688652 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688702 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99d847db-0b8e-4128-af43-a17fe76b77d9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688726 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljppd\" (UniqueName: \"kubernetes.io/projected/4b88ca59-d36e-4682-99e1-10ef4fa85e10-kube-api-access-ljppd\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688763 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688806 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688844 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688864 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db072c-5e1d-4149-99c8-aee1209189ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-node-pullsecrets\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-image-import-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688928 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-client\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit-dir\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688967 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.688998 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lj8kr\" (UniqueName: \"kubernetes.io/projected/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-kube-api-access-lj8kr\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689020 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad15b399-2051-480d-8389-f58f94c10d81-config\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwgb9\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-kube-api-access-fwgb9\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689108 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkjj2\" (UniqueName: \"kubernetes.io/projected/44748a56-ff71-45b3-a67a-34d5bf7ae56b-kube-api-access-dkjj2\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad15b399-2051-480d-8389-f58f94c10d81-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 
20:31:13.689178 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689218 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44748a56-ff71-45b3-a67a-34d5bf7ae56b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689277 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4b88ca59-d36e-4682-99e1-10ef4fa85e10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689296 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad15b399-2051-480d-8389-f58f94c10d81-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689319 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg4zt\" (UniqueName: \"kubernetes.io/projected/b8f95d7e-96c6-475c-8bef-d72937cc36b4-kube-api-access-qg4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689338 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b88ca59-d36e-4682-99e1-10ef4fa85e10-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689357 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-serving-cert\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5db072c-5e1d-4149-99c8-aee1209189ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8f95d7e-96c6-475c-8bef-d72937cc36b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.689934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.707431 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jtczv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.708101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.710868 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.713528 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.714664 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.715763 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.715818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.715925 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.716087 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.716210 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.717946 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.718193 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.718826 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.719225 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.719646 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.726101 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.726591 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.726844 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kfl48"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.727068 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.728134 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743003 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743050 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743638 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.743964 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744298 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744503 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744540 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744783 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.744892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.745369 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.747629 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.748722 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.755802 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.755969 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.755809 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756501 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756596 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.756623 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.757931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.758988 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.759409 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.759747 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.760002 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.760016 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.760279 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.761105 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.761746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.767325 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.771067 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.771194 4790 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.771844 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.772513 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.774316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x7zgr"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.775728 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.776126 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.776519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.777943 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.781779 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.787546 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.787922 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.788166 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.789465 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.789588 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit-dir\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792196 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lj8kr\" (UniqueName: \"kubernetes.io/projected/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-kube-api-access-lj8kr\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad15b399-2051-480d-8389-f58f94c10d81-config\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwgb9\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-kube-api-access-fwgb9\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792260 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792328 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44748a56-ff71-45b3-a67a-34d5bf7ae56b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkjj2\" (UniqueName: 
\"kubernetes.io/projected/44748a56-ff71-45b3-a67a-34d5bf7ae56b-kube-api-access-dkjj2\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad15b399-2051-480d-8389-f58f94c10d81-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4b88ca59-d36e-4682-99e1-10ef4fa85e10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad15b399-2051-480d-8389-f58f94c10d81-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg4zt\" (UniqueName: \"kubernetes.io/projected/b8f95d7e-96c6-475c-8bef-d72937cc36b4-kube-api-access-qg4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b88ca59-d36e-4682-99e1-10ef4fa85e10-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792483 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-serving-cert\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5db072c-5e1d-4149-99c8-aee1209189ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792515 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8f95d7e-96c6-475c-8bef-d72937cc36b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792548 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk6nf\" (UniqueName: \"kubernetes.io/projected/c5db072c-5e1d-4149-99c8-aee1209189ba-kube-api-access-nk6nf\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-encryption-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d847db-0b8e-4128-af43-a17fe76b77d9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792662 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-serving-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x7zgr\" (UID: 
\"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792856 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/99d847db-0b8e-4128-af43-a17fe76b77d9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 
20:31:13.792961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljppd\" (UniqueName: \"kubernetes.io/projected/4b88ca59-d36e-4682-99e1-10ef4fa85e10-kube-api-access-ljppd\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.792997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793017 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793037 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db072c-5e1d-4149-99c8-aee1209189ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-node-pullsecrets\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-image-import-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.793107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-client\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.794278 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit-dir\") 
pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.795257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad15b399-2051-480d-8389-f58f94c10d81-config\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.796644 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.796873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.797543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/4b88ca59-d36e-4682-99e1-10ef4fa85e10-available-featuregates\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.797891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.799156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-audit\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.799424 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.799720 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.803151 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-serving-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.807721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/99d847db-0b8e-4128-af43-a17fe76b77d9-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.807883 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808058 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-trusted-ca-bundle\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808708 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b8f95d7e-96c6-475c-8bef-d72937cc36b4-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.808923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.809524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5db072c-5e1d-4149-99c8-aee1209189ba-config\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.809607 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.810940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad15b399-2051-480d-8389-f58f94c10d81-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.810949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.811035 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.811132 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-node-pullsecrets\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.811891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/44748a56-ff71-45b3-a67a-34d5bf7ae56b-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.812252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5db072c-5e1d-4149-99c8-aee1209189ba-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.813797 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b88ca59-d36e-4682-99e1-10ef4fa85e10-serving-cert\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.815616 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.816571 4790 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.816696 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d847db-0b8e-4128-af43-a17fe76b77d9-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.817035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.817293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-encryption-config\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.817709 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-etcd-client\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.818542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-image-import-ca\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.820224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.820350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-serving-cert\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.820817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821060 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821465 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.821979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822211 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822454 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.822606 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.823281 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmlmp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.823309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 
20:31:13.823370 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.824192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.825828 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.827857 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.835410 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.835806 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.836453 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.844330 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.844557 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.853747 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.858910 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.858816 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.859027 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-pzx4q"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.859877 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jfdgz"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.859958 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.861590 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vp24d"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.862438 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.863196 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.865022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.868577 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jtczv"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.870119 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.870807 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.870936 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.872186 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.873731 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.875600 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kfl48"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.876164 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.878477 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.878630 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.879763 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.881323 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.882235 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.882579 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.883918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zfhhl"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.885052 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.885563 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.886632 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.887664 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.888555 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.888706 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.889777 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zgzvb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.890771 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.890984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.890771 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.892614 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.893044 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.893129 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.893550 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.894355 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zws8z"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.894443 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.895161 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vp24d"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.896111 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.897157 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.898262 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-vggp9"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.899033 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.899349 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.900895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.901772 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.902771 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.903778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.905226 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.907535 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.908691 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.910330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.910959 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.911044 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jw27w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.914856 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-zwfns"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.915316 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919203 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919232 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919249 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.919340 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.923634 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.924223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zgzvb"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.926793 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jw27w"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.930193 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.930406 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zwfns"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.931797 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cxj7h"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.932907 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.933419 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.934698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cxj7h"] Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.949864 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.969904 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 20:31:13 crc kubenswrapper[4790]: I0313 20:31:13.989933 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.009928 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.015503 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.015551 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.029944 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.030703 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.050711 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.069888 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.090302 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.109946 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.130027 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.150529 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.169878 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.190780 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.209815 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.230449 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.250535 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.276285 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.344996 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ad15b399-2051-480d-8389-f58f94c10d81-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-bxb2l\" (UID: \"ad15b399-2051-480d-8389-f58f94c10d81\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.365169 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lj8kr\" (UniqueName: \"kubernetes.io/projected/1db4655f-49dd-48c8-a290-c3c4f2fb74ba-kube-api-access-lj8kr\") pod \"apiserver-76f77b778f-x7zgr\" (UID: \"1db4655f-49dd-48c8-a290-c3c4f2fb74ba\") " pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.384451 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwgb9\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-kube-api-access-fwgb9\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399441 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv9j5\" (UniqueName: \"kubernetes.io/projected/4fa77308-6519-4481-b87b-4a1b066bada3-kube-api-access-rv9j5\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-config\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" 
Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399565 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n25p\" (UniqueName: \"kubernetes.io/projected/3635b091-f7bf-4c6d-bb7a-5723b36f990f-kube-api-access-5n25p\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071ab142-7ad6-43bc-aa6a-e6761ea33b15-serving-cert\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.399886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-encryption-config\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400200 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xc7h\" (UniqueName: \"kubernetes.io/projected/071ab142-7ad6-43bc-aa6a-e6761ea33b15-kube-api-access-6xc7h\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400337 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.400442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4fa77308-6519-4481-b87b-4a1b066bada3-machine-approver-tls\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400474 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-service-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-auth-proxy-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400665 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzpr9\" (UniqueName: \"kubernetes.io/projected/6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150-kube-api-access-hzpr9\") pod \"downloads-7954f5f757-zfhhl\" (UID: \"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150\") " pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-serving-cert\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400762 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-images\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-policies\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400846 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6b8\" (UniqueName: \"kubernetes.io/projected/a626166a-5d74-4dd9-b838-746731bfedef-kube-api-access-vw6b8\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-trusted-ca\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dr5wq\" (UniqueName: \"kubernetes.io/projected/94386d3d-038a-4e4d-9e97-fd04336847a0-kube-api-access-dr5wq\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.400997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a626166a-5d74-4dd9-b838-746731bfedef-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-dir\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401051 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-client\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-config\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401128 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94386d3d-038a-4e4d-9e97-fd04336847a0-serving-cert\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401299 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.401325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-config\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.401685 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:14.901634808 +0000 UTC m=+205.922750769 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.405159 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkjj2\" (UniqueName: \"kubernetes.io/projected/44748a56-ff71-45b3-a67a-34d5bf7ae56b-kube-api-access-dkjj2\") pod \"cluster-samples-operator-665b6dd947-fhxvv\" (UID: \"44748a56-ff71-45b3-a67a-34d5bf7ae56b\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.424777 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"route-controller-manager-6576b87f9c-ftx7g\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.443899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljppd\" (UniqueName: \"kubernetes.io/projected/4b88ca59-d36e-4682-99e1-10ef4fa85e10-kube-api-access-ljppd\") pod \"openshift-config-operator-7777fb866f-7ql4r\" (UID: \"4b88ca59-d36e-4682-99e1-10ef4fa85e10\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.464933 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg4zt\" (UniqueName: \"kubernetes.io/projected/b8f95d7e-96c6-475c-8bef-d72937cc36b4-kube-api-access-qg4zt\") pod \"control-plane-machine-set-operator-78cbb6b69f-qr47g\" (UID: \"b8f95d7e-96c6-475c-8bef-d72937cc36b4\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.483420 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.484047 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk6nf\" (UniqueName: \"kubernetes.io/projected/c5db072c-5e1d-4149-99c8-aee1209189ba-kube-api-access-nk6nf\") pod \"openshift-apiserver-operator-796bbdcf4f-9l97v\" (UID: \"c5db072c-5e1d-4149-99c8-aee1209189ba\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.494027 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75sc9\" (UniqueName: \"kubernetes.io/projected/bcf10b74-f8ce-4748-a813-5aefe86f13f7-kube-api-access-75sc9\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.502664 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.002636105 +0000 UTC m=+206.023751996 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-proxy-tls\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502748 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-etcd-client\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502842 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502864 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d4b8de-5800-44a1-b2d9-338e4d267866-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.502927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af8dabc-a918-4188-8257-112b5f8d71d0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzpr9\" (UniqueName: \"kubernetes.io/projected/6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150-kube-api-access-hzpr9\") pod \"downloads-7954f5f757-zfhhl\" (UID: \"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150\") " pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503100 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-serving-cert\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503147 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-images\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503169 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-policies\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-srv-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbmtd\" (UniqueName: \"kubernetes.io/projected/4e8cc2ad-07fc-4d24-956e-94599d58be06-kube-api-access-qbmtd\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq5sl\" (UniqueName: \"kubernetes.io/projected/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-kube-api-access-hq5sl\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bc71397-bb77-45b3-92c4-77710458d4fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503580 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-webhook-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6b8\" (UniqueName: \"kubernetes.io/projected/a626166a-5d74-4dd9-b838-746731bfedef-kube-api-access-vw6b8\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-service-ca-bundle\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503710 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc71397-bb77-45b3-92c4-77710458d4fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dr5wq\" (UniqueName: \"kubernetes.io/projected/94386d3d-038a-4e4d-9e97-fd04336847a0-kube-api-access-dr5wq\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.503837 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-policies\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-images\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504144 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwxj6\" (UniqueName: 
\"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-kube-api-access-kwxj6\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504414 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm22j\" (UniqueName: \"kubernetes.io/projected/5f9c2f7c-9058-4ad2-84a2-037d212792ad-kube-api-access-bm22j\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-dir\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8rjf\" (UniqueName: \"kubernetes.io/projected/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-kube-api-access-b8rjf\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504528 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-client\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-tmpfs\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-csi-data-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpwnr\" (UniqueName: \"kubernetes.io/projected/9e6c6344-8059-43d7-97be-273d115b8471-kube-api-access-gpwnr\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-default-certificate\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504713 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx89m\" (UniqueName: \"kubernetes.io/projected/929728d6-959b-4532-a9de-298aed7edb3f-kube-api-access-vx89m\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504738 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504757 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9vps\" (UniqueName: \"kubernetes.io/projected/4af8dabc-a918-4188-8257-112b5f8d71d0-kube-api-access-x9vps\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42ljl\" (UniqueName: \"kubernetes.io/projected/2f612fb7-c001-4a97-b17c-008bcf100be1-kube-api-access-42ljl\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-config\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jvkr\" (UniqueName: \"kubernetes.io/projected/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-kube-api-access-5jvkr\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504902 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504921 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504940 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071ab142-7ad6-43bc-aa6a-e6761ea33b15-serving-cert\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-encryption-config\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" 
Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chlbb\" (UniqueName: \"kubernetes.io/projected/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-kube-api-access-chlbb\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.504993 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.505000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.505025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-metrics-certs\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.505090 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3635b091-f7bf-4c6d-bb7a-5723b36f990f-audit-dir\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.506048 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.006036327 +0000 UTC m=+206.027152288 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xc7h\" (UniqueName: \"kubernetes.io/projected/071ab142-7ad6-43bc-aa6a-e6761ea33b15-kube-api-access-6xc7h\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506291 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-config\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/929728d6-959b-4532-a9de-298aed7edb3f-signing-key\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-config\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sl24\" (UniqueName: \"kubernetes.io/projected/32d4b8de-5800-44a1-b2d9-338e4d267866-kube-api-access-8sl24\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506538 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506546 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506597 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-serving-cert\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506745 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-mountpoint-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.506913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-serving-cert\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507278 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507443 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507567 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-auth-proxy-config\") pod 
\"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4fa77308-6519-4481-b87b-4a1b066bada3-machine-approver-tls\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507776 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-service-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507809 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4af8dabc-a918-4188-8257-112b5f8d71d0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.507946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8ttg\" (UniqueName: \"kubernetes.io/projected/4cfd91e9-ce88-4004-b936-551d50d26a7d-kube-api-access-p8ttg\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508043 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508066 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-images\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-etcd-client\") pod 
\"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gbdr\" (UniqueName: \"kubernetes.io/projected/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-kube-api-access-9gbdr\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508196 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.508270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-socket-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4fa77308-6519-4481-b87b-4a1b066bada3-auth-proxy-config\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"controller-manager-879f6c89f-zsqd7\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509688 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-encryption-config\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.509845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc 
kubenswrapper[4790]: I0313 20:31:14.508293 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d4b8de-5800-44a1-b2d9-338e4d267866-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2b96\" (UniqueName: \"kubernetes.io/projected/979fe4d1-6e0f-4b07-b994-c183a200a1cc-kube-api-access-l2b96\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-config\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510417 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510431 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-trusted-ca\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510600 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f612fb7-c001-4a97-b17c-008bcf100be1-proxy-tls\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510633 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc71397-bb77-45b3-92c4-77710458d4fe-config\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510694 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31b24f51-5194-4af5-a171-bd55caaf8ded-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-service-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510854 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a626166a-5d74-4dd9-b838-746731bfedef-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510878 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4lb2\" (UniqueName: \"kubernetes.io/projected/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-kube-api-access-h4lb2\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5dd2\" (UniqueName: \"kubernetes.io/projected/631645f5-2f1a-41e7-ba2a-a665c827acb5-kube-api-access-t5dd2\") pod \"migrator-59844c95c7-qsg78\" (UID: \"631645f5-2f1a-41e7-ba2a-a665c827acb5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.510962 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/929728d6-959b-4532-a9de-298aed7edb3f-signing-cabundle\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: 
I0313 20:31:14.511013 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8313e458-290f-42ba-8656-dc9dcf0e0b98-trusted-ca\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511160 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-config\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511286 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-srv-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511352 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94386d3d-038a-4e4d-9e97-fd04336847a0-serving-cert\") pod 
\"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511370 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"auto-csr-approver-29557230-8pqh8\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-plugins-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511431 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-config\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511449 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmp7h\" (UniqueName: \"kubernetes.io/projected/31b24f51-5194-4af5-a171-bd55caaf8ded-kube-api-access-zmp7h\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t4tw\" (UniqueName: \"kubernetes.io/projected/21386249-439b-4454-828b-f9da9ecce958-kube-api-access-8t4tw\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv9j5\" (UniqueName: \"kubernetes.io/projected/4fa77308-6519-4481-b87b-4a1b066bada3-kube-api-access-rv9j5\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-stats-auth\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" 
Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511578 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511612 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8313e458-290f-42ba-8656-dc9dcf0e0b98-metrics-tls\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511644 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n25p\" (UniqueName: \"kubernetes.io/projected/3635b091-f7bf-4c6d-bb7a-5723b36f990f-kube-api-access-5n25p\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511754 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511825 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511879 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e8cc2ad-07fc-4d24-956e-94599d58be06-metrics-tls\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-registration-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.512554 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-config\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.511540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-service-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.514054 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a626166a-5d74-4dd9-b838-746731bfedef-config\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.514206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/071ab142-7ad6-43bc-aa6a-e6761ea33b15-serving-cert\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.514822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4fa77308-6519-4481-b87b-4a1b066bada3-machine-approver-tls\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.515062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.515166 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3635b091-f7bf-4c6d-bb7a-5723b36f990f-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.517483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94386d3d-038a-4e4d-9e97-fd04336847a0-serving-cert\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.517579 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/94386d3d-038a-4e4d-9e97-fd04336847a0-trusted-ca\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 
20:31:14.520584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/071ab142-7ad6-43bc-aa6a-e6761ea33b15-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.520886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3635b091-f7bf-4c6d-bb7a-5723b36f990f-serving-cert\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.522101 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a626166a-5d74-4dd9-b838-746731bfedef-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.527669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.529036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"oauth-openshift-558db77b4-szftl\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.534641 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.547897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/99d847db-0b8e-4128-af43-a17fe76b77d9-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-f5jjm\" (UID: \"99d847db-0b8e-4128-af43-a17fe76b77d9\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.550037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.558656 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.571251 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.582659 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.590731 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.604670 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.611215 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.613957 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.614172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"auto-csr-approver-29557230-8pqh8\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.614517 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.614707 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.114681532 +0000 UTC m=+206.135797433 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615005 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-plugins-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmp7h\" (UniqueName: \"kubernetes.io/projected/31b24f51-5194-4af5-a171-bd55caaf8ded-kube-api-access-zmp7h\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615078 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8t4tw\" (UniqueName: \"kubernetes.io/projected/21386249-439b-4454-828b-f9da9ecce958-kube-api-access-8t4tw\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615102 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-stats-auth\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8313e458-290f-42ba-8656-dc9dcf0e0b98-metrics-tls\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-ca\") pod 
\"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615239 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615292 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-registration-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e8cc2ad-07fc-4d24-956e-94599d58be06-metrics-tls\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75sc9\" (UniqueName: \"kubernetes.io/projected/bcf10b74-f8ce-4748-a813-5aefe86f13f7-kube-api-access-75sc9\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615360 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-proxy-tls\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-etcd-client\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615437 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-plugins-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af8dabc-a918-4188-8257-112b5f8d71d0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d4b8de-5800-44a1-b2d9-338e4d267866-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-srv-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq5sl\" (UniqueName: \"kubernetes.io/projected/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-kube-api-access-hq5sl\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbmtd\" (UniqueName: \"kubernetes.io/projected/4e8cc2ad-07fc-4d24-956e-94599d58be06-kube-api-access-qbmtd\") pod 
\"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-webhook-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bc71397-bb77-45b3-92c4-77710458d4fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-service-ca-bundle\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615831 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615861 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc71397-bb77-45b3-92c4-77710458d4fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615894 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwxj6\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-kube-api-access-kwxj6\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.615999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bm22j\" (UniqueName: \"kubernetes.io/projected/5f9c2f7c-9058-4ad2-84a2-037d212792ad-kube-api-access-bm22j\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616022 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8rjf\" (UniqueName: \"kubernetes.io/projected/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-kube-api-access-b8rjf\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616064 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-tmpfs\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-csi-data-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616106 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpwnr\" (UniqueName: \"kubernetes.io/projected/9e6c6344-8059-43d7-97be-273d115b8471-kube-api-access-gpwnr\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616131 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-default-certificate\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx89m\" (UniqueName: \"kubernetes.io/projected/929728d6-959b-4532-a9de-298aed7edb3f-kube-api-access-vx89m\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616234 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9vps\" (UniqueName: \"kubernetes.io/projected/4af8dabc-a918-4188-8257-112b5f8d71d0-kube-api-access-x9vps\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42ljl\" (UniqueName: \"kubernetes.io/projected/2f612fb7-c001-4a97-b17c-008bcf100be1-kube-api-access-42ljl\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jvkr\" (UniqueName: \"kubernetes.io/projected/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-kube-api-access-5jvkr\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616371 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616488 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chlbb\" (UniqueName: \"kubernetes.io/projected/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-kube-api-access-chlbb\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616521 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-metrics-certs\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-config\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616565 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/929728d6-959b-4532-a9de-298aed7edb3f-signing-key\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sl24\" (UniqueName: \"kubernetes.io/projected/32d4b8de-5800-44a1-b2d9-338e4d267866-kube-api-access-8sl24\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-serving-cert\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616661 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-mountpoint-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616683 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-serving-cert\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616756 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4af8dabc-a918-4188-8257-112b5f8d71d0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616830 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8ttg\" (UniqueName: \"kubernetes.io/projected/4cfd91e9-ce88-4004-b936-551d50d26a7d-kube-api-access-p8ttg\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-images\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: 
\"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616932 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gbdr\" (UniqueName: \"kubernetes.io/projected/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-kube-api-access-9gbdr\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.616977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-socket-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d4b8de-5800-44a1-b2d9-338e4d267866-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2b96\" (UniqueName: \"kubernetes.io/projected/979fe4d1-6e0f-4b07-b994-c183a200a1cc-kube-api-access-l2b96\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617042 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-registration-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-config\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617078 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f612fb7-c001-4a97-b17c-008bcf100be1-proxy-tls\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc71397-bb77-45b3-92c4-77710458d4fe-config\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617127 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.617154 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31b24f51-5194-4af5-a171-bd55caaf8ded-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-service-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4lb2\" (UniqueName: \"kubernetes.io/projected/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-kube-api-access-h4lb2\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618202 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t5dd2\" (UniqueName: \"kubernetes.io/projected/631645f5-2f1a-41e7-ba2a-a665c827acb5-kube-api-access-t5dd2\") pod \"migrator-59844c95c7-qsg78\" (UID: \"631645f5-2f1a-41e7-ba2a-a665c827acb5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/929728d6-959b-4532-a9de-298aed7edb3f-signing-cabundle\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618251 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8313e458-290f-42ba-8656-dc9dcf0e0b98-trusted-ca\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618273 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618335 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-srv-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.618354 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.620546 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.621156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.621461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.621544 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-socket-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.622628 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.622818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-config\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-csi-data-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-srv-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.623940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/9e6c6344-8059-43d7-97be-273d115b8471-mountpoint-dir\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.624850 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.624901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2f612fb7-c001-4a97-b17c-008bcf100be1-images\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc 
kubenswrapper[4790]: I0313 20:31:14.625351 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-etcd-client\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.625435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e8cc2ad-07fc-4d24-956e-94599d58be06-metrics-tls\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.625660 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.125645479 +0000 UTC m=+206.146761370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.626847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-tmpfs\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.627553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/21386249-439b-4454-828b-f9da9ecce958-etcd-service-ca\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.629048 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.629182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bc71397-bb77-45b3-92c4-77710458d4fe-config\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.630449 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.630867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.631247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.633029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-profile-collector-cert\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.635018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2f612fb7-c001-4a97-b17c-008bcf100be1-proxy-tls\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.635130 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc71397-bb77-45b3-92c4-77710458d4fe-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.635842 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.636143 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21386249-439b-4454-828b-f9da9ecce958-serving-cert\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.637543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.654803 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.674048 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 
20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.676169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.694535 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.710174 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.712302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32d4b8de-5800-44a1-b2d9-338e4d267866-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.718861 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.719518 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.219503894 +0000 UTC m=+206.240619785 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.726766 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.732773 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.750721 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.771624 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.772822 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.773294 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/32d4b8de-5800-44a1-b2d9-338e4d267866-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.789817 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.812319 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.821098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.821945 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.32192467 +0000 UTC m=+206.343040631 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.832798 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.835222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/31b24f51-5194-4af5-a171-bd55caaf8ded-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.854983 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.868353 4790 request.go:700] Waited for 1.008055644s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress/secrets?fieldSelector=metadata.name%3Drouter-certs-default&limit=500&resourceVersion=0 Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.873823 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.892155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.895238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-default-certificate\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.900728 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-stats-auth\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.914615 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.925136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.925475 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.425446646 +0000 UTC m=+206.446562537 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.925964 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:14 crc kubenswrapper[4790]: E0313 20:31:14.926297 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.426284638 +0000 UTC m=+206.447400529 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.930020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-metrics-certs\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.931446 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.950191 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.955416 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-service-ca-bundle\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.967394 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.969010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.970243 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.975117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.984609 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l"] Mar 13 20:31:14 crc kubenswrapper[4790]: I0313 20:31:14.991044 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.011369 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.023878 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-proxy-tls\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.026822 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.027734 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.527713768 +0000 UTC m=+206.548829659 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.031894 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-x7zgr"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.033617 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.037355 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.045028 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.049592 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.061605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4cfd91e9-ce88-4004-b936-551d50d26a7d-srv-cert\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.069170 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.089829 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.096665 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4af8dabc-a918-4188-8257-112b5f8d71d0-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.110364 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.129570 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.130205 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.630180775 +0000 UTC m=+206.651296746 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.130731 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.139276 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4af8dabc-a918-4188-8257-112b5f8d71d0-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.152076 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.170099 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.181718 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.183264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8313e458-290f-42ba-8656-dc9dcf0e0b98-metrics-tls\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.184665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.184715 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r"] Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.189978 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: W0313 20:31:15.191913 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b88ca59_d36e_4682_99e1_10ef4fa85e10.slice/crio-8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609 WatchSource:0}: Error finding container 8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609: Status 404 returned error can't find the container with id 8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609 Mar 13 20:31:15 crc kubenswrapper[4790]: W0313 20:31:15.192956 4790 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9680aeb7_b61a_46a8_baf5_44715261e4a5.slice/crio-7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54 WatchSource:0}: Error finding container 7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54: Status 404 returned error can't find the container with id 7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54 Mar 13 20:31:15 crc kubenswrapper[4790]: W0313 20:31:15.198400 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod869d7601_27fe_4a6a_840b_a9811c4d1e06.slice/crio-261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e WatchSource:0}: Error finding container 261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e: Status 404 returned error can't find the container with id 261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.210699 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.229936 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.231474 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.231639 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.731619904 +0000 UTC m=+206.752735795 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.231910 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.232260 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.732248492 +0000 UTC m=+206.753364383 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.257545 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.259443 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8313e458-290f-42ba-8656-dc9dcf0e0b98-trusted-ca\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.271802 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.293849 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.310648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.321500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.333312 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.333515 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.833487796 +0000 UTC m=+206.854603697 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.334294 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.334911 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.834894084 +0000 UTC m=+206.856009965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.340021 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.349559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.350254 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.370808 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.380445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-apiservice-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.381203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-webhook-cert\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:15 crc 
kubenswrapper[4790]: I0313 20:31:15.390366 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.410600 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.432913 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.435621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.436283 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:15.936270032 +0000 UTC m=+206.957385923 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.439437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/929728d6-959b-4532-a9de-298aed7edb3f-signing-key\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.450245 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.456291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/929728d6-959b-4532-a9de-298aed7edb3f-signing-cabundle\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.470292 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.489789 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.495563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.510927 4790 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.530795 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.536955 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.537649 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.037622989 +0000 UTC m=+207.058738880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.540537 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-serving-cert\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.550835 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.555623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-config\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.570034 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.590515 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.609917 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.620393 4790 secret.go:188] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.620495 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs podName:979fe4d1-6e0f-4b07-b994-c183a200a1cc nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.120471945 +0000 UTC m=+207.141587846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs") pod "machine-config-server-vggp9" (UID: "979fe4d1-6e0f-4b07-b994-c183a200a1cc") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624866 4790 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624890 4790 secret.go:188] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624933 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume podName:5f9c2f7c-9058-4ad2-84a2-037d212792ad nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.124914825 +0000 UTC m=+207.146030716 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume") pod "dns-default-zwfns" (UID: "5f9c2f7c-9058-4ad2-84a2-037d212792ad") : failed to sync configmap cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.624972 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token podName:979fe4d1-6e0f-4b07-b994-c183a200a1cc nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.124952736 +0000 UTC m=+207.146068717 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token") pod "machine-config-server-vggp9" (UID: "979fe4d1-6e0f-4b07-b994-c183a200a1cc") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626095 4790 secret.go:188] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626139 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert podName:bcf10b74-f8ce-4748-a813-5aefe86f13f7 nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.126128728 +0000 UTC m=+207.147244679 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert") pod "ingress-canary-cxj7h" (UID: "bcf10b74-f8ce-4748-a813-5aefe86f13f7") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626159 4790 secret.go:188] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.626191 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls podName:5f9c2f7c-9058-4ad2-84a2-037d212792ad nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.126185239 +0000 UTC m=+207.147301130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls") pod "dns-default-zwfns" (UID: "5f9c2f7c-9058-4ad2-84a2-037d212792ad") : failed to sync secret cache: timed out waiting for the condition Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.631856 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.634307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" event={"ID":"99d847db-0b8e-4128-af43-a17fe76b77d9","Type":"ContainerStarted","Data":"ebfe703c3346c51c55d0091fdef8277d072070f295dcf6c21a0d3512628de2cc"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.634357 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" event={"ID":"99d847db-0b8e-4128-af43-a17fe76b77d9","Type":"ContainerStarted","Data":"b816014f13cfdff9d0d158091e5577b68d530c30531691cf5dd060532ad4ad8b"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.636467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" event={"ID":"b8f95d7e-96c6-475c-8bef-d72937cc36b4","Type":"ContainerStarted","Data":"de9a5a029572d8130097c923ea75100942a444a6c4280d1bccce64d2d69cba59"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.636496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" event={"ID":"b8f95d7e-96c6-475c-8bef-d72937cc36b4","Type":"ContainerStarted","Data":"b62648494389c80e3eaa2b1b2b854ec9a63c118140f231636d4806a9711e69c9"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.637900 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.638061 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.138031931 +0000 UTC m=+207.159147842 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.638523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.638618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerStarted","Data":"4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.638653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerStarted","Data":"7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.639514 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.639845 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.139831099 +0000 UTC m=+207.160947090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.640878 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerStarted","Data":"b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.640928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerStarted","Data":"99cf4ef26fb9eb5a3a40ad496b60c26b191859906bd206806ca175b1e727b6b2"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.641734 4790 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-szftl container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" start-of-body= Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.641789 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.8:6443/healthz\": dial tcp 10.217.0.8:6443: connect: connection refused" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.641838 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.645948 4790 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ftx7g container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.646022 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.651038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.654075 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" event={"ID":"44748a56-ff71-45b3-a67a-34d5bf7ae56b","Type":"ContainerStarted","Data":"e97e6497c51c2fe5530b50c36f5849b6d8cd976e0fc685660defa3b9e67a0c15"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.654137 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" event={"ID":"44748a56-ff71-45b3-a67a-34d5bf7ae56b","Type":"ContainerStarted","Data":"48d283e7a36b98ce9ace9c712d6b36f26cae6a6fb99bf24b120f5b593ad2f89c"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.654154 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" event={"ID":"44748a56-ff71-45b3-a67a-34d5bf7ae56b","Type":"ContainerStarted","Data":"53513cfc7dd35443cac97edfbe3f8b6ea8c9c0ab7472ef6bc6ae0515a7351549"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.655686 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" event={"ID":"c5db072c-5e1d-4149-99c8-aee1209189ba","Type":"ContainerStarted","Data":"32826041d22fbf81f5b23358d558441cd59a28cb615a950d1fb409e66cbb34ab"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.655742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" event={"ID":"c5db072c-5e1d-4149-99c8-aee1209189ba","Type":"ContainerStarted","Data":"cb8e272961bb2b28d937933c7b4e6b41a719555d43e36b797bc307b7e9163e90"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.657095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" event={"ID":"ad15b399-2051-480d-8389-f58f94c10d81","Type":"ContainerStarted","Data":"da9b8ff2acb058a5d935a292de2d3d9c5023c6f57e9d87d6a5c17d9accf74e90"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.657142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" event={"ID":"ad15b399-2051-480d-8389-f58f94c10d81","Type":"ContainerStarted","Data":"874c15ececd0b63d991fbafbb9359795460fd7f07325a5f3001aadfdfe1c3ef3"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.660736 4790 generic.go:334] "Generic (PLEG): container finished" podID="4b88ca59-d36e-4682-99e1-10ef4fa85e10" containerID="781d003f62040f0fc1eb1bf495b03e746bb6b54b3e07185c148fce4a51a3a49d" exitCode=0 Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.662580 4790 generic.go:334] "Generic (PLEG): container finished" podID="1db4655f-49dd-48c8-a290-c3c4f2fb74ba" containerID="5aca94d81c2dfec69adb29425b5bbddde9204e3417b4b3e8b6253c05a7384489" exitCode=0 Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.666756 4790 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zsqd7 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.666820 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.672899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 
20:31:15.675978 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676015 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" event={"ID":"4b88ca59-d36e-4682-99e1-10ef4fa85e10","Type":"ContainerDied","Data":"781d003f62040f0fc1eb1bf495b03e746bb6b54b3e07185c148fce4a51a3a49d"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" event={"ID":"4b88ca59-d36e-4682-99e1-10ef4fa85e10","Type":"ContainerStarted","Data":"8216dd3e1085fcfbe8a126b55ed0b79dea9bde42cdb5342bbb76fb27ef744609"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerDied","Data":"5aca94d81c2dfec69adb29425b5bbddde9204e3417b4b3e8b6253c05a7384489"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676089 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerStarted","Data":"bafc42cbeb4a06000c3f2adeb55459b9581ddf69b803cb49804f88aa0878c560"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerStarted","Data":"b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.676105 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerStarted","Data":"261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e"} Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.690015 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.710591 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.730033 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.739745 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.739831 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.23981528 +0000 UTC m=+207.260931171 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.741174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.742261 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.242247896 +0000 UTC m=+207.263363787 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.749968 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.770850 4790 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.790731 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.811089 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.831129 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.843402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.843692 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.343663615 +0000 UTC m=+207.364779506 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.843938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.844281 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.344272441 +0000 UTC m=+207.365388332 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.849314 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.870770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.888718 4790 request.go:700] Waited for 1.955520875s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.890563 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.909615 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.944929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:15 crc kubenswrapper[4790]: E0313 20:31:15.945669 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:16.445633448 +0000 UTC m=+207.466749349 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.976287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzpr9\" (UniqueName: \"kubernetes.io/projected/6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150-kube-api-access-hzpr9\") pod \"downloads-7954f5f757-zfhhl\" (UID: \"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150\") " pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:15 crc kubenswrapper[4790]: I0313 20:31:15.989329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6b8\" (UniqueName: \"kubernetes.io/projected/a626166a-5d74-4dd9-b838-746731bfedef-kube-api-access-vw6b8\") pod \"machine-api-operator-5694c8668f-jfdgz\" (UID: \"a626166a-5d74-4dd9-b838-746731bfedef\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.015313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dr5wq\" (UniqueName: \"kubernetes.io/projected/94386d3d-038a-4e4d-9e97-fd04336847a0-kube-api-access-dr5wq\") pod \"console-operator-58897d9998-rmlmp\" (UID: \"94386d3d-038a-4e4d-9e97-fd04336847a0\") " pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.029411 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.046480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.046913 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.546895653 +0000 UTC m=+207.568011544 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.051489 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xc7h\" (UniqueName: \"kubernetes.io/projected/071ab142-7ad6-43bc-aa6a-e6761ea33b15-kube-api-access-6xc7h\") pod \"authentication-operator-69f744f599-zws8z\" (UID: \"071ab142-7ad6-43bc-aa6a-e6761ea33b15\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.072836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.087533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv9j5\" (UniqueName: \"kubernetes.io/projected/4fa77308-6519-4481-b87b-4a1b066bada3-kube-api-access-rv9j5\") pod \"machine-approver-56656f9798-gtpkz\" (UID: \"4fa77308-6519-4481-b87b-4a1b066bada3\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.092932 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.109835 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n25p\" (UniqueName: \"kubernetes.io/projected/3635b091-f7bf-4c6d-bb7a-5723b36f990f-kube-api-access-5n25p\") pod \"apiserver-7bbb656c7d-tvv7w\" (UID: \"3635b091-f7bf-4c6d-bb7a-5723b36f990f\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.136503 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"auto-csr-approver-29557230-8pqh8\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.145274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t4tw\" (UniqueName: \"kubernetes.io/projected/21386249-439b-4454-828b-f9da9ecce958-kube-api-access-8t4tw\") pod \"etcd-operator-b45778765-kfl48\" (UID: \"21386249-439b-4454-828b-f9da9ecce958\") " pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147566 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147730 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147789 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.147802 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.647784257 +0000 UTC m=+207.668900148 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147865 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147936 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.147969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.148107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.149533 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f9c2f7c-9058-4ad2-84a2-037d212792ad-config-volume\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.150698 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.650681967 +0000 UTC m=+207.671798058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-certs\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153627 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/979fe4d1-6e0f-4b07-b994-c183a200a1cc-node-bootstrap-token\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153786 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5f9c2f7c-9058-4ad2-84a2-037d212792ad-metrics-tls\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.153967 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bcf10b74-f8ce-4748-a813-5aefe86f13f7-cert\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.154076 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.159527 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.171794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"console-f9d7485db-q5j7f\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.188036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75sc9\" (UniqueName: \"kubernetes.io/projected/bcf10b74-f8ce-4748-a813-5aefe86f13f7-kube-api-access-75sc9\") pod \"ingress-canary-cxj7h\" (UID: \"bcf10b74-f8ce-4748-a813-5aefe86f13f7\") " pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.189678 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.212436 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42ljl\" (UniqueName: \"kubernetes.io/projected/2f612fb7-c001-4a97-b17c-008bcf100be1-kube-api-access-42ljl\") pod \"machine-config-operator-74547568cd-bm7bc\" (UID: \"2f612fb7-c001-4a97-b17c-008bcf100be1\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.214301 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.234282 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"marketplace-operator-79b997595-jnbzb\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.251885 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.252545 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.752525857 +0000 UTC m=+207.773641758 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.262565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmp7h\" (UniqueName: \"kubernetes.io/projected/31b24f51-5194-4af5-a171-bd55caaf8ded-kube-api-access-zmp7h\") pod \"package-server-manager-789f6589d5-cszm6\" (UID: \"31b24f51-5194-4af5-a171-bd55caaf8ded\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.266568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwxj6\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-kube-api-access-kwxj6\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.274641 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cxj7h" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.302037 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpwnr\" (UniqueName: \"kubernetes.io/projected/9e6c6344-8059-43d7-97be-273d115b8471-kube-api-access-gpwnr\") pod \"csi-hostpathplugin-jw27w\" (UID: \"9e6c6344-8059-43d7-97be-273d115b8471\") " pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.324749 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.327824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jvkr\" (UniqueName: \"kubernetes.io/projected/75413740-91a3-4356-8cbd-4b5d2e7ff7ac-kube-api-access-5jvkr\") pod \"packageserver-d55dfcdfc-jpkh8\" (UID: \"75413740-91a3-4356-8cbd-4b5d2e7ff7ac\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.335168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bm22j\" (UniqueName: \"kubernetes.io/projected/5f9c2f7c-9058-4ad2-84a2-037d212792ad-kube-api-access-bm22j\") pod \"dns-default-zwfns\" (UID: \"5f9c2f7c-9058-4ad2-84a2-037d212792ad\") " pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.343130 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-rmlmp"] Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.343526 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.346473 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.351523 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.354126 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.354766 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.854750487 +0000 UTC m=+207.875866378 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.360559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8rjf\" (UniqueName: \"kubernetes.io/projected/aa273b20-a91d-43ea-a18d-784ad7cdc7a7-kube-api-access-b8rjf\") pod \"service-ca-operator-777779d784-vs2vp\" (UID: \"aa273b20-a91d-43ea-a18d-784ad7cdc7a7\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.374410 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.380968 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gbdr\" (UniqueName: \"kubernetes.io/projected/c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55-kube-api-access-9gbdr\") pod \"catalog-operator-68c6474976-mcrq2\" (UID: \"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.385431 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.398826 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2b96\" (UniqueName: \"kubernetes.io/projected/979fe4d1-6e0f-4b07-b994-c183a200a1cc-kube-api-access-l2b96\") pod \"machine-config-server-vggp9\" (UID: \"979fe4d1-6e0f-4b07-b994-c183a200a1cc\") " pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.406608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8313e458-290f-42ba-8656-dc9dcf0e0b98-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9bn6p\" (UID: \"8313e458-290f-42ba-8656-dc9dcf0e0b98\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:16 crc kubenswrapper[4790]: W0313 20:31:16.421289 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94386d3d_038a_4e4d_9e97_fd04336847a0.slice/crio-b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65 WatchSource:0}: Error finding container b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65: Status 404 returned error can't find the container with id b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65 Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.441070 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.443198 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0bc71397-bb77-45b3-92c4-77710458d4fe-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-rwgfw\" (UID: \"0bc71397-bb77-45b3-92c4-77710458d4fe\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.447725 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-zws8z"] Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.450041 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.455031 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.455692 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:16.955672853 +0000 UTC m=+207.976788744 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.457703 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chlbb\" (UniqueName: \"kubernetes.io/projected/658b4bb6-837c-48ed-b5f3-aa30bd1e9740-kube-api-access-chlbb\") pod \"router-default-5444994796-pzx4q\" (UID: \"658b4bb6-837c-48ed-b5f3-aa30bd1e9740\") " pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.462162 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.492914 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.495437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq5sl\" (UniqueName: \"kubernetes.io/projected/d88c0d3c-4e7a-4dd8-a99d-6118b840c031-kube-api-access-hq5sl\") pod \"machine-config-controller-84d6567774-hz5vf\" (UID: \"d88c0d3c-4e7a-4dd8-a99d-6118b840c031\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.524249 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-vggp9" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.524437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbmtd\" (UniqueName: \"kubernetes.io/projected/4e8cc2ad-07fc-4d24-956e-94599d58be06-kube-api-access-qbmtd\") pod \"dns-operator-744455d44c-jtczv\" (UID: \"4e8cc2ad-07fc-4d24-956e-94599d58be06\") " pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.525312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sl24\" (UniqueName: \"kubernetes.io/projected/32d4b8de-5800-44a1-b2d9-338e4d267866-kube-api-access-8sl24\") pod \"kube-storage-version-migrator-operator-b67b599dd-v7kxq\" (UID: \"32d4b8de-5800-44a1-b2d9-338e4d267866\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.526971 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8ttg\" (UniqueName: \"kubernetes.io/projected/4cfd91e9-ce88-4004-b936-551d50d26a7d-kube-api-access-p8ttg\") pod \"olm-operator-6b444d44fb-q2wgf\" (UID: \"4cfd91e9-ce88-4004-b936-551d50d26a7d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.537450 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.553676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5dd2\" (UniqueName: \"kubernetes.io/projected/631645f5-2f1a-41e7-ba2a-a665c827acb5-kube-api-access-t5dd2\") pod \"migrator-59844c95c7-qsg78\" (UID: \"631645f5-2f1a-41e7-ba2a-a665c827acb5\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.557140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.557551 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.057534565 +0000 UTC m=+208.078650456 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.565056 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.577493 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9vps\" (UniqueName: \"kubernetes.io/projected/4af8dabc-a918-4188-8257-112b5f8d71d0-kube-api-access-x9vps\") pod \"openshift-controller-manager-operator-756b6f6bc6-8mg7x\" (UID: \"4af8dabc-a918-4188-8257-112b5f8d71d0\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.598146 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4lb2\" (UniqueName: \"kubernetes.io/projected/71ed135e-3db4-4f03-a89e-f82bc3cf0b34-kube-api-access-h4lb2\") pod \"multus-admission-controller-857f4d67dd-vp24d\" (UID: \"71ed135e-3db4-4f03-a89e-f82bc3cf0b34\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.604024 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.627521 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx89m\" (UniqueName: \"kubernetes.io/projected/929728d6-959b-4532-a9de-298aed7edb3f-kube-api-access-vx89m\") pod \"service-ca-9c57cc56f-zgzvb\" (UID: \"929728d6-959b-4532-a9de-298aed7edb3f\") " pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.636001 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.639802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"collect-profiles-29557230-rjmvn\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.653965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3baed13c-c4c1-4fc2-9527-bfd2273efbbb-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-wksbn\" (UID: \"3baed13c-c4c1-4fc2-9527-bfd2273efbbb\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.699915 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.700012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.700487 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.200471739 +0000 UTC m=+208.221587630 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.700584 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.700862 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.701328 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.703102 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.711181 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.722733 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.733716 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.770780 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.790119 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.792534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerStarted","Data":"a87ec9f1894e7d9b7b1786a0ef0e7eb51b5127f6208e76aa1e81c2bea75c5403"} Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.792594 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" event={"ID":"1db4655f-49dd-48c8-a290-c3c4f2fb74ba","Type":"ContainerStarted","Data":"9cbf9d380232bd1754c195399e9a361b22282e4e22905893b49fa8e90e03d8b6"} Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.801265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.801962 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.301938369 +0000 UTC m=+208.323054260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.877560 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" event={"ID":"071ab142-7ad6-43bc-aa6a-e6761ea33b15","Type":"ContainerStarted","Data":"2bfb08f96742cc86b7a606328d661c5f984e9f7428b3ad5d63c6930aa0e96574"} Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.975483 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:16 crc kubenswrapper[4790]: E0313 20:31:16.982694 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.482662968 +0000 UTC m=+208.503778859 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:16 crc kubenswrapper[4790]: I0313 20:31:16.982895 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.076982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.078041 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.578023243 +0000 UTC m=+208.599139224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.178748 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.178877 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.678854726 +0000 UTC m=+208.699970617 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.179271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.179701 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.679689388 +0000 UTC m=+208.700805279 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.211290 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jfdgz"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.216066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" event={"ID":"4fa77308-6519-4481-b87b-4a1b066bada3","Type":"ContainerStarted","Data":"227100ec933a6af452ff20a1fdd2ae7a6b5c83e731d26ed986ecd24ac0bd2cd4"} Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.235444 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-fhxvv" podStartSLOduration=163.235426309 podStartE2EDuration="2m43.235426309s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.234137604 +0000 UTC m=+208.255253515" watchObservedRunningTime="2026-03-13 20:31:17.235426309 +0000 UTC m=+208.256542200" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.237209 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cxj7h"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.243165 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.257782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" event={"ID":"94386d3d-038a-4e4d-9e97-fd04336847a0","Type":"ContainerStarted","Data":"b1342d6476c269b9d560d9693777a45967c9d64c93334f67af06d9dc7985ed65"} Mar 13 
20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.272971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" event={"ID":"4b88ca59-d36e-4682-99e1-10ef4fa85e10","Type":"ContainerStarted","Data":"efc8fb8838f66d4475c37013c5a7c283b65e44000d6f8ff67542d50e99d54e3a"} Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.273062 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-zfhhl"] Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.273090 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.280463 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.288431 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.288719 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.290260 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.790240034 +0000 UTC m=+208.811355925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.391823 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.397322 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:17.897301967 +0000 UTC m=+208.918417858 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.500866 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.501624 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.001606694 +0000 UTC m=+209.022722585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.513720 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-f5jjm" podStartSLOduration=162.513702352 podStartE2EDuration="2m42.513702352s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.512784987 +0000 UTC m=+208.533900888" watchObservedRunningTime="2026-03-13 20:31:17.513702352 +0000 UTC m=+208.534818243" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.633994 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.634457 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.134445714 +0000 UTC m=+209.155561605 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.643761 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" podStartSLOduration=163.643741777 podStartE2EDuration="2m43.643741777s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.643061508 +0000 UTC m=+208.664177399" watchObservedRunningTime="2026-03-13 20:31:17.643741777 +0000 UTC m=+208.664857658" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.672559 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.735154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.735513 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.235487233 +0000 UTC m=+209.256603144 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.771999 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-bxb2l" podStartSLOduration=162.771981112 podStartE2EDuration="2m42.771981112s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:17.766231366 +0000 UTC m=+208.787347267" watchObservedRunningTime="2026-03-13 20:31:17.771981112 +0000 UTC m=+208.793097003" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.818572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.836547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.836841 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.33682974 +0000 UTC m=+209.357945631 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:17 crc kubenswrapper[4790]: I0313 20:31:17.937113 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:17 crc kubenswrapper[4790]: E0313 20:31:17.937528 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.437509359 +0000 UTC m=+209.458625250 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.040225 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.040772 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.540760158 +0000 UTC m=+209.561876049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.103131 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" podStartSLOduration=163.103116028 podStartE2EDuration="2m43.103116028s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.100042504 +0000 UTC m=+209.121158395" watchObservedRunningTime="2026-03-13 20:31:18.103116028 +0000 UTC m=+209.124231919" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.141585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.142142 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.64193715 +0000 UTC m=+209.663053041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.169930 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" podStartSLOduration=163.169900128 podStartE2EDuration="2m43.169900128s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.156707321 +0000 UTC m=+209.177823212" watchObservedRunningTime="2026-03-13 20:31:18.169900128 +0000 UTC m=+209.191016049" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.242582 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.248317 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.248753 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.748736975 +0000 UTC m=+209.769852916 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: W0313 20:31:18.314185 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f612fb7_c001_4a97_b17c_008bcf100be1.slice/crio-7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6 WatchSource:0}: Error finding container 7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6: Status 404 returned error can't find the container with id 7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6 Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.314650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pzx4q" event={"ID":"658b4bb6-837c-48ed-b5f3-aa30bd1e9740","Type":"ContainerStarted","Data":"b42249f3cd9250bbc6e5200abf52b5b898153886f7c6af8892dbd5b3671bbbc1"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.316747 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-9l97v" podStartSLOduration=164.316734579 podStartE2EDuration="2m44.316734579s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.314531939 +0000 UTC m=+209.335647830" watchObservedRunningTime="2026-03-13 20:31:18.316734579 +0000 UTC m=+209.337850470" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.347211 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" event={"ID":"d598b7c0-7c77-4903-9138-d8a3d01f9efe","Type":"ContainerStarted","Data":"3851738f410766329c5133a13a2bdd38c600122354cde8d6b4c645c3b69815b7"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.349920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.350298 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.850282458 +0000 UTC m=+209.871398349 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.371153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" event={"ID":"a626166a-5d74-4dd9-b838-746731bfedef","Type":"ContainerStarted","Data":"660f5500f3fb94b3dbe414e94cd324b8a637bddb7665324b98b41825393454a0"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.396978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vggp9" event={"ID":"979fe4d1-6e0f-4b07-b994-c183a200a1cc","Type":"ContainerStarted","Data":"1a1addd1d481e0bd74e584a8d677c2f02520a9773b0da658e9f30ec883c4da25"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.436059 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" event={"ID":"94386d3d-038a-4e4d-9e97-fd04336847a0","Type":"ContainerStarted","Data":"a84716ffc7fb5f6120fa4da6d9fe9147bd141b929386b6b944fa920bcd3f7794"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.436695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.441078 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-rmlmp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.441120 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" podUID="94386d3d-038a-4e4d-9e97-fd04336847a0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.449799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cxj7h" event={"ID":"bcf10b74-f8ce-4748-a813-5aefe86f13f7","Type":"ContainerStarted","Data":"5a65029229d0cab9fcd1f3d47d5f233fa9fb2bd8556317970877f9af1851b06f"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.451011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.451347 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:18.951335017 +0000 UTC m=+209.972450908 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.464474 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerStarted","Data":"b436cb5e3f5d468aba071bbb52490fe41be9f758fe2861c288ae2f9dacadcab0"} Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.512528 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qr47g" podStartSLOduration=163.512513675 podStartE2EDuration="2m43.512513675s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.510876541 +0000 UTC m=+209.531992432" watchObservedRunningTime="2026-03-13 20:31:18.512513675 +0000 UTC m=+209.533629566" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.522047 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.529768 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.548188 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-zwfns"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.553136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.554345 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.054325839 +0000 UTC m=+210.075441730 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.576504 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.578510 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-kfl48"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.591010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.651586 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.654934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.656887 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.156872918 +0000 UTC m=+210.177988809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.665874 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6"] Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.689451 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" podStartSLOduration=164.689431971 podStartE2EDuration="2m44.689431971s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.666612682 +0000 UTC m=+209.687728573" watchObservedRunningTime="2026-03-13 20:31:18.689431971 +0000 UTC m=+209.710547862" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.691088 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" podStartSLOduration=164.691082625 podStartE2EDuration="2m44.691082625s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.688128555 +0000 UTC m=+209.709244456" watchObservedRunningTime="2026-03-13 20:31:18.691082625 +0000 UTC m=+209.712198516" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.753967 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-7ql4r" podStartSLOduration=164.753944829 podStartE2EDuration="2m44.753944829s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:18.753025744 +0000 UTC m=+209.774141655" watchObservedRunningTime="2026-03-13 20:31:18.753944829 +0000 UTC m=+209.775060720" Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.756029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.756646 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.256625612 +0000 UTC m=+210.277741503 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.860504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.860991 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.36097677 +0000 UTC m=+210.382092661 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.961976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.962250 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.462231144 +0000 UTC m=+210.483347045 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:18 crc kubenswrapper[4790]: I0313 20:31:18.962628 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:18 crc kubenswrapper[4790]: E0313 20:31:18.963004 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.462993515 +0000 UTC m=+210.484109396 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.012533 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.064156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.064579 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.564564119 +0000 UTC m=+210.585680010 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.167262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.167911 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.66789502 +0000 UTC m=+210.689010911 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.269151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.269515 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.769497783 +0000 UTC m=+210.790613674 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.376703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.377453 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.877430768 +0000 UTC m=+210.898546829 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.451267 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.462091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.475311 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.476972 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-jtczv"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.477431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.477824 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:19.977809529 +0000 UTC m=+210.998925430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.478813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.482159 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-vp24d"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.500192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-zgzvb"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.516962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerStarted","Data":"e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.519868 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.523075 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.523121 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.535506 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.537323 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.542882 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-zfhhl" podStartSLOduration=165.542862662 podStartE2EDuration="2m45.542862662s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:19.538787932 +0000 UTC m=+210.559903823" watchObservedRunningTime="2026-03-13 20:31:19.542862662 +0000 UTC m=+210.563978563" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.580640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: 
\"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.581138 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.08112311 +0000 UTC m=+211.102239001 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.611962 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zwfns" event={"ID":"5f9c2f7c-9058-4ad2-84a2-037d212792ad","Type":"ContainerStarted","Data":"0eedb6e2e026bf0350d7cfb4fedfa87785c3792514589d8122f9b4d9cd911bcb"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.634835 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-jw27w"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.634884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-vggp9" event={"ID":"979fe4d1-6e0f-4b07-b994-c183a200a1cc","Type":"ContainerStarted","Data":"7764814c3f81c716083fe3d17dce5b432444f2236b1e438d519d3d6c955d6ac3"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.657169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" event={"ID":"4fa77308-6519-4481-b87b-4a1b066bada3","Type":"ContainerStarted","Data":"f25db4438bee6f278153a4aaf42cb8022dd0c3a47d7c97ce09868dc42bec3cf1"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.681303 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-vggp9" podStartSLOduration=6.681282565 podStartE2EDuration="6.681282565s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:19.678485149 +0000 UTC m=+210.699601060" watchObservedRunningTime="2026-03-13 20:31:19.681282565 +0000 UTC m=+210.702398456" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.681867 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.683292 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.183261149 +0000 UTC m=+211.204377040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.686400 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.686954 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.186940508 +0000 UTC m=+211.208056399 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.787166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.788410 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.288391868 +0000 UTC m=+211.309507759 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.825361 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" event={"ID":"8313e458-290f-42ba-8656-dc9dcf0e0b98","Type":"ContainerStarted","Data":"e7af220ef9cb06ca1618fb14587d09a03ba5e8c64c063b4963a7dff1277a1dfa"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831410 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" event={"ID":"8313e458-290f-42ba-8656-dc9dcf0e0b98","Type":"ContainerStarted","Data":"2c7848e321e3bffdebe62b492ffe330b9d19d7a800b06f31cba20c02985722d0"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" event={"ID":"3635b091-f7bf-4c6d-bb7a-5723b36f990f","Type":"ContainerStarted","Data":"2fe35e99845f5e357405009c0aeae1924f14ff443ac7a27625d938eed41ee4c9"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831459 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831476 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-pzx4q" event={"ID":"658b4bb6-837c-48ed-b5f3-aa30bd1e9740","Type":"ContainerStarted","Data":"ef8810e1999c0e38d934d579b1b6991b98c1734764e9479e6ac6a38a3aac4d83"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" event={"ID":"31b24f51-5194-4af5-a171-bd55caaf8ded","Type":"ContainerStarted","Data":"de0f1d50be7a8f2eacacb074e61715f55608c2fedfc108a8f20a00f9559d0971"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831498 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831510 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" 
event={"ID":"2f612fb7-c001-4a97-b17c-008bcf100be1","Type":"ContainerStarted","Data":"3c0be889c426f02bf89b75e3c89fa6c3a6dbbffe59e1eea58f2f512e037baa1d"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831572 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf"] Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831587 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" event={"ID":"2f612fb7-c001-4a97-b17c-008bcf100be1","Type":"ContainerStarted","Data":"7a46ca8e0bdbb4d72f3d47182d95660d4ed68daa8376f6819cafacb28a6f0ed6"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" event={"ID":"21386249-439b-4454-828b-f9da9ecce958","Type":"ContainerStarted","Data":"c6a883977c0139d5e2572820fd8eb84305881145c03efffe0c863048ab150134"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831617 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" event={"ID":"071ab142-7ad6-43bc-aa6a-e6761ea33b15","Type":"ContainerStarted","Data":"257694f26f804f72df81d6133e0baf8090ad14d327a03a5c9806ac387bc5f050"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" event={"ID":"aa273b20-a91d-43ea-a18d-784ad7cdc7a7","Type":"ContainerStarted","Data":"9adf4c0180dfbe028bdf748ff6214a4ac18946f1fa1412ba91b6e4cf19a21205"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.831645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" event={"ID":"75413740-91a3-4356-8cbd-4b5d2e7ff7ac","Type":"ContainerStarted","Data":"bc28c370c2cd3eea34feb8211cc516775cf455ee4b1d4b7b2006a39cf1fbccca"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.833438 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jpkh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" start-of-body= Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.833487 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" podUID="75413740-91a3-4356-8cbd-4b5d2e7ff7ac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": dial tcp 10.217.0.37:5443: connect: connection refused" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.838893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" event={"ID":"a626166a-5d74-4dd9-b838-746731bfedef","Type":"ContainerStarted","Data":"ee9bca9a81de3dc588b20df5dee11de8f4a9b928f5c22d19b9d18575c546ec8b"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.848061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerStarted","Data":"af91b2c2002cfba8d95ebe9f9e0aa50107b9d61f68613dde04ff9ae4ab302650"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.884651 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cxj7h" event={"ID":"bcf10b74-f8ce-4748-a813-5aefe86f13f7","Type":"ContainerStarted","Data":"faa34ab9170c82fd862ea9ca00bc38b95f4e671591c94fdc7f41df6a91938ca8"} Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.884961 4790 patch_prober.go:28] interesting pod/console-operator-58897d9998-rmlmp container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" start-of-body= Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.884995 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" podUID="94386d3d-038a-4e4d-9e97-fd04336847a0" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.10:8443/readyz\": dial tcp 10.217.0.10:8443: connect: connection refused" Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.888995 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.901374 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.401348239 +0000 UTC m=+211.422464130 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.994948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:19 crc kubenswrapper[4790]: E0313 20:31:19.996401 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.496360485 +0000 UTC m=+211.517476376 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:19 crc kubenswrapper[4790]: I0313 20:31:19.997888 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.005681 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.505663338 +0000 UTC m=+211.526779299 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.084794 4790 patch_prober.go:28] interesting pod/apiserver-76f77b778f-x7zgr container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]log ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]etcd ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/max-in-flight-filter failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/storage-object-count-tracker-hook failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectcache ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 13 20:31:20 crc kubenswrapper[4790]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 13 20:31:20 crc kubenswrapper[4790]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 20:31:20 crc kubenswrapper[4790]: livez check failed Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.084860 4790 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" podUID="1db4655f-49dd-48c8-a290-c3c4f2fb74ba" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.104838 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.105238 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.605220226 +0000 UTC m=+211.626336117 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.206281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.206673 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.706657595 +0000 UTC m=+211.727773486 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.308426 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.308617 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:20.808587178 +0000 UTC m=+211.829703069 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.309072 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.309484 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.809470912 +0000 UTC m=+211.830586803 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.340730 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" podStartSLOduration=165.340712769 podStartE2EDuration="2m45.340712769s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.304790695 +0000 UTC m=+211.325906596" watchObservedRunningTime="2026-03-13 20:31:20.340712769 +0000 UTC m=+211.361828660" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.405132 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-zws8z" podStartSLOduration=166.405110534 podStartE2EDuration="2m46.405110534s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.341190981 +0000 UTC m=+211.362306872" watchObservedRunningTime="2026-03-13 20:31:20.405110534 +0000 UTC m=+211.426226425" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.411731 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.423044 4790 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress/router-default-5444994796-pzx4q" podStartSLOduration=165.42302445 podStartE2EDuration="2m45.42302445s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.419956597 +0000 UTC m=+211.441072488" watchObservedRunningTime="2026-03-13 20:31:20.42302445 +0000 UTC m=+211.444140341" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.423413 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-q5j7f" podStartSLOduration=166.423407461 podStartE2EDuration="2m46.423407461s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.388620337 +0000 UTC m=+211.409736228" watchObservedRunningTime="2026-03-13 20:31:20.423407461 +0000 UTC m=+211.444523352" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.429521 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:20.929483195 +0000 UTC m=+211.950599076 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.459671 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-cxj7h" podStartSLOduration=7.459643163 podStartE2EDuration="7.459643163s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.457233087 +0000 UTC m=+211.478348978" watchObservedRunningTime="2026-03-13 20:31:20.459643163 +0000 UTC m=+211.480759054" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.522167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.524290 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.024270324 +0000 UTC m=+212.045386215 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.638093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.638532 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.138509461 +0000 UTC m=+212.159625352 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.703453 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.714013 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:20 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:20 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.714074 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.740034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.740429 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.240418493 +0000 UTC m=+212.261534384 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.841401 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.842097 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.342078998 +0000 UTC m=+212.363194889 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.906982 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerStarted","Data":"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.907031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerStarted","Data":"bad985ac5d6a6fd6a14b185a97704f5e25df7aba222388f921733e6977b5b5eb"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.907611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.912187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" event={"ID":"71ed135e-3db4-4f03-a89e-f82bc3cf0b34","Type":"ContainerStarted","Data":"21dd5065f044c4cde3c563bc54cd1a1d73c878a3e4d4ae12f5615c1c41fddac6"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.921434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" event={"ID":"2f612fb7-c001-4a97-b17c-008bcf100be1","Type":"ContainerStarted","Data":"a1a65fb61f0ec00e53a4d527c48b1b58cdcd0982d95fec32f4e5401038a3b50d"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.937618 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jnbzb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 
10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.937712 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.943295 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" event={"ID":"aa273b20-a91d-43ea-a18d-784ad7cdc7a7","Type":"ContainerStarted","Data":"340e5a5971916a43c5b3ad0ffd63cb363edcbf940b0c07728572b29458170d7e"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.944555 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podStartSLOduration=165.944536756 podStartE2EDuration="2m45.944536756s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.9443025 +0000 UTC m=+211.965418401" watchObservedRunningTime="2026-03-13 20:31:20.944536756 +0000 UTC m=+211.965652647" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.945253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:20 crc kubenswrapper[4790]: E0313 20:31:20.945605 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.445592194 +0000 UTC m=+212.466708085 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.956042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" event={"ID":"32d4b8de-5800-44a1-b2d9-338e4d267866","Type":"ContainerStarted","Data":"ce5f8fb1afded31f135561bc74fdabae9e87ea779c474a5d7e5363e7393d45f1"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.956095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" event={"ID":"32d4b8de-5800-44a1-b2d9-338e4d267866","Type":"ContainerStarted","Data":"afcd681e3ca3f274a0d102b69be8865accb5e4227dd4cf0cc0d2bee3d1b47374"} Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.988613 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bm7bc" podStartSLOduration=165.98859576 podStartE2EDuration="2m45.98859576s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:20.970855889 +0000 UTC m=+211.991971780" watchObservedRunningTime="2026-03-13 20:31:20.98859576 +0000 UTC m=+212.009711651" Mar 13 20:31:20 crc kubenswrapper[4790]: I0313 20:31:20.993470 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" event={"ID":"a626166a-5d74-4dd9-b838-746731bfedef","Type":"ContainerStarted","Data":"5a42b8b73b9fdb6bdfefe8035ac61f25ee593a39779349eb3dea10234e77a77a"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.049321 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.049504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" event={"ID":"929728d6-959b-4532-a9de-298aed7edb3f","Type":"ContainerStarted","Data":"ce676faf3686152a6bfeb6cc2c2d8447af229cebf15ff6aa7a4557faadf52069"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.049550 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" event={"ID":"929728d6-959b-4532-a9de-298aed7edb3f","Type":"ContainerStarted","Data":"878f640622a6e67c5340b650a71d21e2cd3303a67d77486fed168166e4311f6b"} Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.051104 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:21.551083674 +0000 UTC m=+212.572199575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.058566 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-vs2vp" podStartSLOduration=166.057334983 podStartE2EDuration="2m46.057334983s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.013144765 +0000 UTC m=+212.034260656" watchObservedRunningTime="2026-03-13 20:31:21.057334983 +0000 UTC m=+212.078450874" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.076518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" event={"ID":"21386249-439b-4454-828b-f9da9ecce958","Type":"ContainerStarted","Data":"9f5fd8b34062b016215dca4785a4d76e453c761e8bd759cc3e3d46dbbbc45394"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.078071 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v7kxq" podStartSLOduration=166.078057724 podStartE2EDuration="2m46.078057724s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.048451823 +0000 UTC m=+212.069567724" watchObservedRunningTime="2026-03-13 20:31:21.078057724 +0000 UTC m=+212.099173625" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.090490 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jfdgz" podStartSLOduration=166.090467521 podStartE2EDuration="2m46.090467521s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.075846395 +0000 UTC m=+212.096962296" watchObservedRunningTime="2026-03-13 20:31:21.090467521 +0000 UTC m=+212.111583422" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.124524 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-zgzvb" podStartSLOduration=166.124505224 podStartE2EDuration="2m46.124505224s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.12216531 +0000 UTC m=+212.143281221" watchObservedRunningTime="2026-03-13 20:31:21.124505224 +0000 UTC m=+212.145621115" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.134914 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.135981 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.144992 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.145272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.156898 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.159542 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.659525684 +0000 UTC m=+212.680641575 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.161954 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-kfl48" podStartSLOduration=166.161937408 podStartE2EDuration="2m46.161937408s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.155821223 +0000 UTC m=+212.176937124" watchObservedRunningTime="2026-03-13 20:31:21.161937408 +0000 UTC m=+212.183053299" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.175627 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.178963 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" event={"ID":"631645f5-2f1a-41e7-ba2a-a665c827acb5","Type":"ContainerStarted","Data":"e1debe9bc4fb5d653959ee72a515b194439e55c71d3c8cf3bb1c4e3160853ed3"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.179020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" event={"ID":"631645f5-2f1a-41e7-ba2a-a665c827acb5","Type":"ContainerStarted","Data":"4ccdcb482ed2307bbee84e2eeed0b6f5a304d21663576eaa76d1c3639c1c214a"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.191897 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" 
event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerStarted","Data":"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.210773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerStarted","Data":"93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.210817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerStarted","Data":"1809f43b88080170a440a364505c4febd360a062e9e4aabd772262f808d67b1c"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.213098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" event={"ID":"0bc71397-bb77-45b3-92c4-77710458d4fe","Type":"ContainerStarted","Data":"8b273120e9e6fa6d21db14e7f4043b1874784896f97e4fb7f7e2509ebaba0d0e"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.228514 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" event={"ID":"4af8dabc-a918-4188-8257-112b5f8d71d0","Type":"ContainerStarted","Data":"4fff32c4ea6c00908e2e551a63c388c3c5d8ae382436561b2c16d0fbeeacdf04"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.238145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" event={"ID":"8313e458-290f-42ba-8656-dc9dcf0e0b98","Type":"ContainerStarted","Data":"34345c9d241cc9a8ee4f6aefd62ef1d5124244f04a93c3f3cc8ce155d93b0a68"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.247260 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zwfns" event={"ID":"5f9c2f7c-9058-4ad2-84a2-037d212792ad","Type":"ContainerStarted","Data":"cdaf2a7157fded22fd5efff64f4b448f805d453cc47c87b988c689fd7a955583"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.257795 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" podStartSLOduration=81.257779057 podStartE2EDuration="1m21.257779057s" podCreationTimestamp="2026-03-13 20:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.236748456 +0000 UTC m=+212.257864357" watchObservedRunningTime="2026-03-13 20:31:21.257779057 +0000 UTC m=+212.278894948" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.259481 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.259832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.260199 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.760175751 +0000 UTC m=+212.781291662 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.260498 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.260869 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.260998 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.760987814 +0000 UTC m=+212.782103765 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.272715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" event={"ID":"4e8cc2ad-07fc-4d24-956e-94599d58be06","Type":"ContainerStarted","Data":"bc2f09c9de4d6b44e872e0283c9d0d073285dbdc5330b514dd93f93bf6f21a25"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.272795 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" event={"ID":"4e8cc2ad-07fc-4d24-956e-94599d58be06","Type":"ContainerStarted","Data":"73101d384387fa131c32f69a2704201e88074c0216cd838677f8af7c2487d487"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.283455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"89206cfdc78475e8ddb6a6001e20974b19dbaaadccf47cbc7348fe466c89a707"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.284930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" event={"ID":"4fa77308-6519-4481-b87b-4a1b066bada3","Type":"ContainerStarted","Data":"f8f3ef1fa0a5c5987f219c07af21805cc56c30af75e76960d8709d781e5bcbe4"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.286705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" event={"ID":"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55","Type":"ContainerStarted","Data":"9cf6faf1a15f9ad2088ec01ab75d065c7c611763631a16bffb9c50a25548558b"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.287417 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.297052 4790 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-mcrq2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.297116 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" podUID="c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.300932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" event={"ID":"4cfd91e9-ce88-4004-b936-551d50d26a7d","Type":"ContainerStarted","Data":"b62be7a9fef85b12a5dc19d34e61675a0b73a7b4e7b5dc0d47df16a785b5e008"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.300979 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" event={"ID":"4cfd91e9-ce88-4004-b936-551d50d26a7d","Type":"ContainerStarted","Data":"ca4753864246a2b6c303ac90d6f0160c76b073a0ba1800b707c270a9f5dc8841"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.324582 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.327027 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9bn6p" podStartSLOduration=166.326995942 podStartE2EDuration="2m46.326995942s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.323733264 +0000 UTC m=+212.344849165" watchObservedRunningTime="2026-03-13 20:31:21.326995942 +0000 UTC m=+212.348111833" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.329966 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" podStartSLOduration=166.329954812 podStartE2EDuration="2m46.329954812s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.26015324 +0000 UTC m=+212.281269141" watchObservedRunningTime="2026-03-13 20:31:21.329954812 +0000 UTC m=+212.351070703" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.340995 4790 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-q2wgf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" start-of-body= Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.341170 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" podUID="4cfd91e9-ce88-4004-b936-551d50d26a7d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.29:8443/healthz\": dial tcp 10.217.0.29:8443: connect: connection refused" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.343683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" event={"ID":"31b24f51-5194-4af5-a171-bd55caaf8ded","Type":"ContainerStarted","Data":"293d549b7bb7c41d24c912435a30ac98598b11c8ed0e5af3cfe6278641178582"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.345007 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.368044 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.368399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" 
(UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.368646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.369425 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.869406972 +0000 UTC m=+212.890522863 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.371607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.375827 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" podStartSLOduration=166.375804616 podStartE2EDuration="2m46.375804616s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.372583609 +0000 UTC m=+212.393699510" watchObservedRunningTime="2026-03-13 20:31:21.375804616 +0000 UTC m=+212.396920507" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.378110 4790 generic.go:334] "Generic (PLEG): container finished" podID="3635b091-f7bf-4c6d-bb7a-5723b36f990f" containerID="1f34a294151ef6b1c2c7705742307a127d627ac02b120c12574cf0048804f635" exitCode=0 Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.378206 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" event={"ID":"3635b091-f7bf-4c6d-bb7a-5723b36f990f","Type":"ContainerDied","Data":"1f34a294151ef6b1c2c7705742307a127d627ac02b120c12574cf0048804f635"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.407316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.411625 4790 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" podStartSLOduration=166.411554114 podStartE2EDuration="2m46.411554114s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.400698861 +0000 UTC m=+212.421814762" watchObservedRunningTime="2026-03-13 20:31:21.411554114 +0000 UTC m=+212.432670005" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.428955 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" event={"ID":"3baed13c-c4c1-4fc2-9527-bfd2273efbbb","Type":"ContainerStarted","Data":"5198eb8751d46e63c5d90956f3221921504590f0af9ace66aa9925f094473df7"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.458004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" event={"ID":"75413740-91a3-4356-8cbd-4b5d2e7ff7ac","Type":"ContainerStarted","Data":"a0ef3e6162bf90124e9ee5a7a397ff11c686b96e7f49e283e53be6b7c0e7ccc2"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.460325 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gtpkz" podStartSLOduration=167.460308936 podStartE2EDuration="2m47.460308936s" podCreationTimestamp="2026-03-13 20:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.430722564 +0000 UTC m=+212.451838455" watchObservedRunningTime="2026-03-13 20:31:21.460308936 +0000 UTC m=+212.481424827" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.469531 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.470057 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:21.970043199 +0000 UTC m=+212.991159100 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.473804 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" event={"ID":"d88c0d3c-4e7a-4dd8-a99d-6118b840c031","Type":"ContainerStarted","Data":"7c62368176591927ba848072fa4aeb9ff3ebb04bdfaaa36069f9cbbe368c8b44"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.474119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" event={"ID":"d88c0d3c-4e7a-4dd8-a99d-6118b840c031","Type":"ContainerStarted","Data":"826f2a917cb49a4c81678f749c3be86fc911427c2767f16af7f1d57bf83d6e66"} Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.495406 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.495466 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.502653 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" podStartSLOduration=166.502637613 podStartE2EDuration="2m46.502637613s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.460914322 +0000 UTC m=+212.482030213" watchObservedRunningTime="2026-03-13 20:31:21.502637613 +0000 UTC m=+212.523753504" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.503028 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.588907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.589495 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" podStartSLOduration=166.589475777 podStartE2EDuration="2m46.589475777s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.527915348 +0000 UTC m=+212.549031239" watchObservedRunningTime="2026-03-13 20:31:21.589475777 +0000 UTC m=+212.610591668" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.590400 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.090362761 +0000 UTC m=+213.111478652 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694765 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694866 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.694939 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.698847 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.198828331 +0000 UTC m=+213.219944222 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.699208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.700529 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.701112 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.702997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.707240 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:21 crc kubenswrapper[4790]: 
[-]has-synced failed: reason withheld Mar 13 20:31:21 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:21 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.707278 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.798092 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.29804722 +0000 UTC m=+213.319163111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.809223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.809543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.809604 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.810538 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.310521869 +0000 UTC m=+213.331637760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.818977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c54336a0-5a12-4bf9-9807-337dd352fdb6-metrics-certs\") pod \"network-metrics-daemon-mnf26\" (UID: \"c54336a0-5a12-4bf9-9807-337dd352fdb6\") " pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.893018 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.899878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.911197 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.912585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:21 crc kubenswrapper[4790]: E0313 20:31:21.912871 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.412843912 +0000 UTC m=+213.433959803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:21 crc kubenswrapper[4790]: I0313 20:31:21.915936 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mnf26" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.014689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.015449 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.515434703 +0000 UTC m=+213.536550594 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.115513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.116519 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.616500172 +0000 UTC m=+213.637616063 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.131925 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" podStartSLOduration=167.1319062 podStartE2EDuration="2m47.1319062s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:21.599481019 +0000 UTC m=+212.620596920" watchObservedRunningTime="2026-03-13 20:31:22.1319062 +0000 UTC m=+213.153022101" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.134129 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 13 20:31:22 crc kubenswrapper[4790]: W0313 20:31:22.167400 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod09cce78a_6bee_4201_82d7_a4e0dd041c9f.slice/crio-78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598 WatchSource:0}: Error finding container 78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598: Status 404 returned error can't find the container with id 78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598 Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.217254 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.217652 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.717635523 +0000 UTC m=+213.738751414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.320575 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.321061 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.821037166 +0000 UTC m=+213.842153057 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.421914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.422293 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:22.92227415 +0000 UTC m=+213.943390041 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.458442 4790 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jpkh8 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:5443/healthz\": context deadline exceeded" start-of-body= Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.458510 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" podUID="75413740-91a3-4356-8cbd-4b5d2e7ff7ac" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.37:5443/healthz\": context deadline exceeded" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.465975 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34002: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.523808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.524167 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.024151982 +0000 UTC m=+214.045267873 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.536114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" event={"ID":"631645f5-2f1a-41e7-ba2a-a665c827acb5","Type":"ContainerStarted","Data":"fd32340d1498e106ddfdfcea79637a893f01d9582eaa1568adeb1c2bb2e6b827"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.564497 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-qsg78" podStartSLOduration=167.564479715 podStartE2EDuration="2m47.564479715s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.563889169 +0000 UTC m=+213.585005060" watchObservedRunningTime="2026-03-13 20:31:22.564479715 +0000 UTC m=+213.585595606" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.573575 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" event={"ID":"4e8cc2ad-07fc-4d24-956e-94599d58be06","Type":"ContainerStarted","Data":"db20261fe3f06ccd31a0b3d8e807bbc22bd1ed42651686b532bf934a6366cb7c"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.585826 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34008: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.612754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-hz5vf" event={"ID":"d88c0d3c-4e7a-4dd8-a99d-6118b840c031","Type":"ContainerStarted","Data":"6f8fbae1ffd9531f9d9211edbdf7ab1cecaf96d184d3a4e60ed1ffbc8ac0fcaa"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.623055 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-jtczv" podStartSLOduration=167.623033552 podStartE2EDuration="2m47.623033552s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.619802885 +0000 UTC m=+213.640918786" watchObservedRunningTime="2026-03-13 20:31:22.623033552 +0000 UTC m=+213.644149443" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.628142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.628450 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:23.128438999 +0000 UTC m=+214.149554890 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.641733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8mg7x" event={"ID":"4af8dabc-a918-4188-8257-112b5f8d71d0","Type":"ContainerStarted","Data":"4b1fa189bc57f47fde13d528521c81259420b8568b801996e6c3cfa04b4187ec"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.666712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-zwfns" event={"ID":"5f9c2f7c-9058-4ad2-84a2-037d212792ad","Type":"ContainerStarted","Data":"9a63e70d6925e6cb469f20bebea13af4cebd4aed1f83a98e3f2e2e1ef7b88237"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.667070 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.680485 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" event={"ID":"31b24f51-5194-4af5-a171-bd55caaf8ded","Type":"ContainerStarted","Data":"48e4f1927995071e9e6ec8613fddac83aa2a0b65d33722eb0aa5845873e7e4d2"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.719803 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:22 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:22 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:22 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.720135 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.721650 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34016: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.729631 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.729939 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:23.229913489 +0000 UTC m=+214.251029390 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.732162 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.736413 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.236373234 +0000 UTC m=+214.257489125 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.742133 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-zwfns" podStartSLOduration=9.742110479 podStartE2EDuration="9.742110479s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.71963948 +0000 UTC m=+213.740755371" watchObservedRunningTime="2026-03-13 20:31:22.742110479 +0000 UTC m=+213.763226370" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.756981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" event={"ID":"3635b091-f7bf-4c6d-bb7a-5723b36f990f","Type":"ContainerStarted","Data":"ddec58bb3589a072b686abb8229af2b81ca1aa84380efb2669cc93395f70497a"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.771961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-wksbn" event={"ID":"3baed13c-c4c1-4fc2-9527-bfd2273efbbb","Type":"ContainerStarted","Data":"d5240bb98341498587525757f6e4ac55183c53771ddc0ee8c8ca6abf568dec2f"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.775818 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"09cce78a-6bee-4201-82d7-a4e0dd041c9f","Type":"ContainerStarted","Data":"78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.808131 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" podStartSLOduration=167.808106499 podStartE2EDuration="2m47.808106499s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.801114699 +0000 UTC m=+213.822230600" watchObservedRunningTime="2026-03-13 20:31:22.808106499 +0000 UTC m=+213.829222410" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.837325 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.837480 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.337451783 +0000 UTC m=+214.358567674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.837934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.838309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" event={"ID":"0bc71397-bb77-45b3-92c4-77710458d4fe","Type":"ContainerStarted","Data":"49c5d026eb64b4d5647200c969b2b62605404470459ddf55f9ef06a7895500c5"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.839810 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34018: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.840093 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.340081715 +0000 UTC m=+214.361197606 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.853571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" event={"ID":"c8f3a12d-c2c6-4f1c-a46e-4546ae08ae55","Type":"ContainerStarted","Data":"da1d931701622f01a7bf5ec6ba53ffdf4132f55b9cab7471ac5b57db237b6ce3"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.867067 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mcrq2" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.884213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" event={"ID":"71ed135e-3db4-4f03-a89e-f82bc3cf0b34","Type":"ContainerStarted","Data":"898f5a100df4238843c8507dc6c1ed32963c9d645ec72bd78f8daa0e5433a92b"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.884247 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" event={"ID":"71ed135e-3db4-4f03-a89e-f82bc3cf0b34","Type":"ContainerStarted","Data":"9a6fe40383d5e82d52a9783b6b1e61846ba488ba17a8a6e15b807ad841fc746e"} Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885749 4790 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-jnbzb container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885763 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885788 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.885821 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.889155 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-rwgfw" podStartSLOduration=167.889139955 podStartE2EDuration="2m47.889139955s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:22.878318522 +0000 UTC m=+213.899434423" watchObservedRunningTime="2026-03-13 20:31:22.889139955 +0000 UTC m=+213.910255846" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.897995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jpkh8" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.929605 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-q2wgf" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.930024 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34034: no serving certificate available for the kubelet" Mar 13 20:31:22 crc kubenswrapper[4790]: I0313 20:31:22.941215 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:22 crc kubenswrapper[4790]: E0313 20:31:22.943982 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.443955491 +0000 UTC m=+214.465071392 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:22 crc kubenswrapper[4790]: W0313 20:31:22.964634 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3 WatchSource:0}: Error finding container e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3: Status 404 returned error can't find the container with id e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3 Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.044081 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.044745 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.544728982 +0000 UTC m=+214.565844873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.057488 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-vp24d" podStartSLOduration=168.057465827 podStartE2EDuration="2m48.057465827s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:23.010923605 +0000 UTC m=+214.032039516" watchObservedRunningTime="2026-03-13 20:31:23.057465827 +0000 UTC m=+214.078581728" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.083160 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34038: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.095754 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mnf26"] Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.145192 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.145456 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.645438442 +0000 UTC m=+214.666554333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.248112 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34052: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.249141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.249574 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:23.749553925 +0000 UTC m=+214.770669806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.334643 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34066: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.357560 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.357968 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.857949502 +0000 UTC m=+214.879065393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.462090 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.462460 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:23.962447635 +0000 UTC m=+214.983563526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.562953 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.563227 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.063206676 +0000 UTC m=+215.084322567 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.655586 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34074: no serving certificate available for the kubelet" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.663916 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.664239 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.164227125 +0000 UTC m=+215.185343026 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.710521 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:23 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:23 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:23 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.710579 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.765649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.765847 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.265817138 +0000 UTC m=+215.286933029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.765956 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.766327 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.266312281 +0000 UTC m=+215.287428172 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.866518 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.866670 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.366646551 +0000 UTC m=+215.387762452 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.866734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.867058 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.367046962 +0000 UTC m=+215.388162873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.906200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mnf26" event={"ID":"c54336a0-5a12-4bf9-9807-337dd352fdb6","Type":"ContainerStarted","Data":"0b6794c9fe65f322c28666c06c92a498ea123712d515a209d34b5f14547c9762"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.906265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mnf26" event={"ID":"c54336a0-5a12-4bf9-9807-337dd352fdb6","Type":"ContainerStarted","Data":"a10f7b028aa807afab8b4ec493ac5112136565647beb352116cc09abd06040c3"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.907621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"76a39aadffb3c2d5cb1dd7bb82e2424e525eeb3d16ec6e80c6d388ce2f9367ba"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.907648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"e01da82f6f5bf61206a53cb3ab6efa1113b6fb4c1e9510d6d10b9d4395b585a3"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.908532 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.935627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"5a9565f34a0cfe251d0647d382da9853c80accd576c3333cc9f51fa4ddfb07ad"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.935687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"140f781797bfadbb905fa57351522884427bbf6f5f7df4cc6c210e4bf8aa60dd"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.939662 4790 generic.go:334] "Generic (PLEG): container finished" podID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerID="835625bd5b25d33532e6f1a4c1701e10e108eacd1b2af7b16c3e421ede1a0acf" exitCode=0 Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.939792 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"09cce78a-6bee-4201-82d7-a4e0dd041c9f","Type":"ContainerDied","Data":"835625bd5b25d33532e6f1a4c1701e10e108eacd1b2af7b16c3e421ede1a0acf"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.947666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"8f68ac2f7863be3453e9957cc4344e636b513ffff4577dc51f946700572b6683"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.947719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f121558a4493e216aae0979d701d802330aaa8cc4d4d43b6a5a8d79a4a7f6c7d"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.952531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"35442addd7079011ecf48aca388391d4e0f716842b0ac5563911482db0c7ab7d"} Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.971577 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:31:23 crc kubenswrapper[4790]: I0313 20:31:23.972193 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:23 crc kubenswrapper[4790]: E0313 20:31:23.978533 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.478499133 +0000 UTC m=+215.499615024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.073667 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.076936 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.5769204 +0000 UTC m=+215.598036291 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.179227 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.179649 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.679629915 +0000 UTC m=+215.700745806 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.281228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.281635 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.781619449 +0000 UTC m=+215.802735340 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.383087 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.383277 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.883247133 +0000 UTC m=+215.904363024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.383358 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.383734 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.883718846 +0000 UTC m=+215.904834737 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.484206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.484364 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.984345644 +0000 UTC m=+216.005461535 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.484477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.484748 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:24.984739485 +0000 UTC m=+216.005855376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.529085 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.530110 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.532059 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.532991 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.552889 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-x7zgr" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.555894 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585538 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.585927 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.085909127 +0000 UTC m=+216.107025018 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.585987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687158 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687277 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687340 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687392 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.687897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.689735 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.690058 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:25.19004298 +0000 UTC m=+216.211158971 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.806601 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:24 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:24 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:24 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.806659 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.809213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.809601 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.309585399 +0000 UTC m=+216.330701300 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.811059 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.812534 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.815552 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.815772 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.831075 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"community-operators-672cv\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.853591 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911518 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911654 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.911676 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:24 crc kubenswrapper[4790]: E0313 20:31:24.911976 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.411960945 +0000 UTC m=+216.433076826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.929343 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.933253 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.948397 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:31:24 crc kubenswrapper[4790]: I0313 20:31:24.973533 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34076: no serving certificate available for the kubelet" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.012919 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.013143 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.513086035 +0000 UTC m=+216.534201936 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013572 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013607 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.013817 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.014325 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.514306149 +0000 UTC m=+216.535422140 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.017608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.017885 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.041249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"certified-operators-txx64\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114767 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114924 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114951 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.114979 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.116217 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.616202401 +0000 UTC m=+216.637318292 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.137867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.142006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.144323 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.148736 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.150529 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.151927 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"community-operators-5tr4n\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.186815 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.187089 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" containerID="cri-o://b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83" gracePeriod=30 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.199752 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.233230 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.233721 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.733707126 +0000 UTC m=+216.754823017 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.244088 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.244364 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" containerID="cri-o://b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d" gracePeriod=30 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.259768 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334949 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.334966 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.337618 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.837597232 +0000 UTC m=+216.858713123 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.373208 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436433 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436726 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436783 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.436812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.437556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.437888 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:25.93787347 +0000 UTC m=+216.958989361 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.438336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.468226 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"certified-operators-df8gv\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.498832 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.509607 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.537977 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538029 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") pod \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538097 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") pod \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\" (UID: \"09cce78a-6bee-4201-82d7-a4e0dd041c9f\") " Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "09cce78a-6bee-4201-82d7-a4e0dd041c9f" (UID: "09cce78a-6bee-4201-82d7-a4e0dd041c9f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.538324 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.038307442 +0000 UTC m=+217.059423333 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.538520 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.538817 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.038805186 +0000 UTC m=+217.059921077 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.542148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "09cce78a-6bee-4201-82d7-a4e0dd041c9f" (UID: "09cce78a-6bee-4201-82d7-a4e0dd041c9f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.646600 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.647186 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.1470989 +0000 UTC m=+217.168214801 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.647299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.647955 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/09cce78a-6bee-4201-82d7-a4e0dd041c9f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.648769 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.148686344 +0000 UTC m=+217.169802235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.680062 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.709810 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:25 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:25 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:25 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.709856 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.750952 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.751356 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.251331077 +0000 UTC m=+217.272446968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.856157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.856766 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.356750874 +0000 UTC m=+217.377866765 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.958730 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:25 crc kubenswrapper[4790]: E0313 20:31:25.959152 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.459132759 +0000 UTC m=+217.480248650 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.962479 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:31:25 crc kubenswrapper[4790]: W0313 20:31:25.977710 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda03af74_8c59_4ccf_aff8_03dc6303e322.slice/crio-9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760 WatchSource:0}: Error finding container 9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760: Status 404 returned error can't find the container with id 9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.979130 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.979225 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"09cce78a-6bee-4201-82d7-a4e0dd041c9f","Type":"ContainerDied","Data":"78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.979253 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78a1bbc4c5af6c1cce5a2bc6069daf0b91593a873ea629e68a80642ae7614598" Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.984023 4790 generic.go:334] "Generic (PLEG): container finished" podID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerID="b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83" exitCode=0 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.984091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerDied","Data":"b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.991827 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" exitCode=0 Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.991902 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.991929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerStarted","Data":"073e407a9eaa46913e8a833719c1712b0b191e45db2255328c3b799329f32f02"} Mar 13 20:31:25 crc kubenswrapper[4790]: I0313 20:31:25.996949 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-mnf26" event={"ID":"c54336a0-5a12-4bf9-9807-337dd352fdb6","Type":"ContainerStarted","Data":"a709e96657b309a9665690d02b5c7d5c73211093a04a9611a54e061de11d9da2"} Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.010682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerStarted","Data":"5bff08277bee799461658bd86530c13fa744a49d2daab25cbda9f9c23ac16aa2"} Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.018610 4790 generic.go:334] "Generic (PLEG): container finished" podID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerID="b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d" exitCode=0 Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.018651 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerDied","Data":"b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d"} Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.034461 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mnf26" podStartSLOduration=171.03444391 podStartE2EDuration="2m51.03444391s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:26.032838647 +0000 UTC m=+217.053954538" watchObservedRunningTime="2026-03-13 20:31:26.03444391 +0000 UTC m=+217.055559801" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.054231 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.060159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.061193 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.561179935 +0000 UTC m=+217.582295826 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.154150 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-rmlmp" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.162535 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.163218 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.66318994 +0000 UTC m=+217.684305831 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.191472 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.193981 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.193911 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.194231 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.256016 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264667 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264843 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.264863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") pod \"869d7601-27fe-4a6a-840b-a9811c4d1e06\" (UID: \"869d7601-27fe-4a6a-840b-a9811c4d1e06\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.265050 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.268323 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.768308119 +0000 UTC m=+217.789424010 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.272898 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca" (OuterVolumeSpecName: "client-ca") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.273504 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config" (OuterVolumeSpecName: "config") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.274031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.278841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx" (OuterVolumeSpecName: "kube-api-access-fbpwx") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "kube-api-access-fbpwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.280038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "869d7601-27fe-4a6a-840b-a9811c4d1e06" (UID: "869d7601-27fe-4a6a-840b-a9811c4d1e06"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.295182 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.356632 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.357966 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.358040 4790 patch_prober.go:28] interesting pod/console-f9d7485db-q5j7f container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.358094 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-q5j7f" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" probeResult="failure" output="Get \"https://10.217.0.23:8443/health\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366067 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366176 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366213 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.366244 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.866218913 +0000 UTC m=+217.887334834 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366343 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") pod \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\" (UID: \"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366822 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366842 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366852 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbpwx\" (UniqueName: \"kubernetes.io/projected/869d7601-27fe-4a6a-840b-a9811c4d1e06-kube-api-access-fbpwx\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366861 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/869d7601-27fe-4a6a-840b-a9811c4d1e06-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.366870 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/869d7601-27fe-4a6a-840b-a9811c4d1e06-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.367844 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:26.867834687 +0000 UTC m=+217.888950578 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.368118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca" (OuterVolumeSpecName: "client-ca") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.368216 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config" (OuterVolumeSpecName: "config") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.371821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl" (OuterVolumeSpecName: "kube-api-access-k6xvl") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "kube-api-access-k6xvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.371940 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" (UID: "1f45edb0-2914-47c2-82f3-a0f5a99fe9e9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.375663 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.376530 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.386536 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468006 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.468306 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:26.96828738 +0000 UTC m=+217.989403271 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468483 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468507 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6xvl\" (UniqueName: \"kubernetes.io/projected/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-kube-api-access-k6xvl\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468519 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.468531 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.569647 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.570010 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.069991847 +0000 UTC m=+218.091107748 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.675199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.675572 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.175545308 +0000 UTC m=+218.196661229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.701357 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.707634 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:26 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:26 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:26 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.707698 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.730840 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.731052 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerName="pruner" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731065 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerName="pruner" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.731081 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731088 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.731102 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731110 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731191 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09cce78a-6bee-4201-82d7-a4e0dd041c9f" containerName="pruner" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731204 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" containerName="route-controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731213 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" containerName="controller-manager" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.731917 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.734040 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.745048 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.776895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.778054 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.278041696 +0000 UTC m=+218.299157577 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.877935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.878292 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.378256833 +0000 UTC m=+218.399372724 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878505 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878534 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878578 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.878635 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.878909 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.37889799 +0000 UTC m=+218.400013871 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979329 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979507 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.979535 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.479509647 +0000 UTC m=+218.500625538 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979582 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.979717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:26 crc kubenswrapper[4790]: E0313 20:31:26.980200 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.480184555 +0000 UTC m=+218.501300446 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.980247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:26 crc kubenswrapper[4790]: I0313 20:31:26.980288 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.016255 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"redhat-marketplace-bq4pj\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.034659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" 
event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"aa535420608f709689b7d99354c2ca7e3de7feac56c2bea20ab5aa4a2cb8cb0d"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.041208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" event={"ID":"869d7601-27fe-4a6a-840b-a9811c4d1e06","Type":"ContainerDied","Data":"261ca998108ed493dc900955a8fd9a4c77b099c17c3446f5d7d42417ca41db4e"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.041276 4790 scope.go:117] "RemoveContainer" containerID="b3f64a80f53b3463abb2e75cb2ad8094df85b77279ffcd7d0508ada4f6f68f83" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.041346 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zsqd7" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.045821 4790 generic.go:334] "Generic (PLEG): container finished" podID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" exitCode=0 Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.045894 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.048658 4790 generic.go:334] "Generic (PLEG): container finished" podID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerID="33326be198fd78688d8c0e82df3982727cfbc7e94ef4969d1503af495b1859ed" exitCode=0 Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.048802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"33326be198fd78688d8c0e82df3982727cfbc7e94ef4969d1503af495b1859ed"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.048868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerStarted","Data":"1f3bbc4d7d37e2d400e1366f116e79095d38ddf23a471dd30cc3d7e41c04740d"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.059139 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" event={"ID":"1f45edb0-2914-47c2-82f3-a0f5a99fe9e9","Type":"ContainerDied","Data":"99cf4ef26fb9eb5a3a40ad496b60c26b191859906bd206806ca175b1e727b6b2"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.059179 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.066361 4790 generic.go:334] "Generic (PLEG): container finished" podID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerID="18d45729b57b0625b6ac059bc91aedd72d39045472cf08d5152f47c470f71f43" exitCode=0 Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.066731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"18d45729b57b0625b6ac059bc91aedd72d39045472cf08d5152f47c470f71f43"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.066762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerStarted","Data":"9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760"} Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.078955 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.080220 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-tvv7w" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.087565 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.088665 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.588636155 +0000 UTC m=+218.609752066 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.103773 4790 scope.go:117] "RemoveContainer" containerID="b0fb5457a9676ea9d3a55511a014a0d139b4e8575ca4d1d1a0534aae99f0076d" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.109458 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.124498 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zsqd7"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.127770 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.131037 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.160592 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.201957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.203370 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.703357765 +0000 UTC m=+218.724473646 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.209718 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.246502 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ftx7g"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.253695 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.254365 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.257997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.259068 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.259996 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260100 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260488 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260514 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.260963 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.261085 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.287512 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288016 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288573 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.288896 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.289285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.289550 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.306776 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.307100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.307139 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5fzv\" (UniqueName: 
\"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.307285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.307457 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.807437776 +0000 UTC m=+218.828553667 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.315064 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.315439 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410601 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410631 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: 
\"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410771 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410827 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410847 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410908 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " 
pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.410938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.412069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.412573 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:27.912562876 +0000 UTC m=+218.933678767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.412932 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.449146 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"redhat-marketplace-mf4tm\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.482007 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.493963 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.495201 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.498681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.498904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.512907 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.512994 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513021 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513048 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513066 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513156 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513175 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513195 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.513209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.513470 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.013454971 +0000 UTC m=+219.034570852 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.516186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.517553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.518607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.519556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.519805 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.522524 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.528185 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.553258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"controller-manager-8687f458cd-h5svs\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc 
kubenswrapper[4790]: I0313 20:31:27.555246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.555927 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34086: no serving certificate available for the kubelet" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.561242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"route-controller-manager-69fc968766-v5gfg\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.601776 4790 ???:1] "http: TLS handshake error from 192.168.126.11:34100: no serving certificate available for the kubelet" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.617952 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.618099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.618233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.619067 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.119049542 +0000 UTC m=+219.140165443 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.619503 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.660936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.673174 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f45edb0-2914-47c2-82f3-a0f5a99fe9e9" path="/var/lib/kubelet/pods/1f45edb0-2914-47c2-82f3-a0f5a99fe9e9/volumes" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.674215 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="869d7601-27fe-4a6a-840b-a9811c4d1e06" path="/var/lib/kubelet/pods/869d7601-27fe-4a6a-840b-a9811c4d1e06/volumes" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.685370 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.708614 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:27 crc kubenswrapper[4790]: [-]has-synced failed: reason withheld Mar 13 20:31:27 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:27 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.708666 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.719200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.719622 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-13 20:31:28.219602538 +0000 UTC m=+219.240718429 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.722467 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.737434 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.740929 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.742846 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.747641 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.757305 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821537 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821648 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.821707 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.822106 4790 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.322091616 +0000 UTC m=+219.343207507 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.837516 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.922679 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923177 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923270 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923296 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.923715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: E0313 20:31:27.924073 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.42405571 +0000 UTC m=+219.445171611 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.924362 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:27 crc kubenswrapper[4790]: I0313 20:31:27.952077 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"redhat-operators-hnd2l\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.026120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.026490 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.526476106 +0000 UTC m=+219.547591997 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.053055 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.054790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.074547 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.082217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"e381ee8bd664d5619b0e6d2a4c827fec9aa9a14fcff37d85a821534359f1ff27"} Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.093564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerStarted","Data":"48ce1cd0515d2f72905d7c3b45c89c2baec4ecf2f36741a13ea570b7bf830ee2"} Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.123277 4790 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.130925 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.132214 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.632188752 +0000 UTC m=+219.653304683 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.134497 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.135787 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.147410 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.232316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.232658 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.732645174 +0000 UTC m=+219.753761135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.243933 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.333969 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.334544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.334573 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.334651 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.335339 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.335635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.336196 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.83617575 +0000 UTC m=+219.857291661 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.394591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"redhat-operators-fxjp7\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.436014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.436323 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:28.936311775 +0000 UTC m=+219.957427666 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.509761 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.541194 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.541561 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.041542518 +0000 UTC m=+220.062658409 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.557680 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.643494 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.643888 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.143872671 +0000 UTC m=+220.164988562 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.663880 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.711011 4790 patch_prober.go:28] interesting pod/router-default-5444994796-pzx4q container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 20:31:28 crc kubenswrapper[4790]: [+]has-synced ok Mar 13 20:31:28 crc kubenswrapper[4790]: [+]process-running ok Mar 13 20:31:28 crc kubenswrapper[4790]: healthz check failed Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.711074 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-pzx4q" podUID="658b4bb6-837c-48ed-b5f3-aa30bd1e9740" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.746242 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.747152 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b 
nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.2471177 +0000 UTC m=+220.268233601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.849422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:28 crc kubenswrapper[4790]: E0313 20:31:28.850473 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-13 20:31:29.350453141 +0000 UTC m=+220.371569032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-vqdfm" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.904808 4790 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-13T20:31:28.12331277Z","Handler":null,"Name":""} Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.905926 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.907941 4790 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.907977 4790 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.950659 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 13 20:31:28 crc kubenswrapper[4790]: W0313 20:31:28.990093 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4aa0c26b_aef8_49e9_9904_da9e8d029c9d.slice/crio-050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa WatchSource:0}: Error finding container 
050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa: Status 404 returned error can't find the container with id 050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa Mar 13 20:31:28 crc kubenswrapper[4790]: I0313 20:31:28.993055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.052309 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.055218 4790 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.055265 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.137232 4790 generic.go:334] "Generic (PLEG): container finished" podID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerID="93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a" exitCode=0 Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.137318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerDied","Data":"93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.139112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerStarted","Data":"050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.143262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" event={"ID":"9e6c6344-8059-43d7-97be-273d115b8471","Type":"ContainerStarted","Data":"eaace0a3c5e5a3b8c9c6cbe3e7cb57115efee8da33f517b47b08d24616047a6a"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.144816 4790 generic.go:334] "Generic (PLEG): container finished" podID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerID="81d571ea6f444235cc217ca2f76bd3ade803e952dcea7fa197b363c62b207fc9" exitCode=0 Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.144883 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"81d571ea6f444235cc217ca2f76bd3ade803e952dcea7fa197b363c62b207fc9"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.149548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-vqdfm\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.151840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerStarted","Data":"c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.151890 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerStarted","Data":"db98d9a2f6fdbbe8abd9aaa1bfbfc3ef07a5ef170be435e5a3e26c5a2b07958a"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.168117 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.169556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerStarted","Data":"fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.169598 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerStarted","Data":"e64961300406621b1586951d2f5b3f6e49675a9edac316b75dde99552bd4b189"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.172548 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.173729 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerID="afed47472efd96d5fb96f1be65a82143aad59afc7569141f603e4362a1d44b0e" exitCode=0 Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.173784 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"afed47472efd96d5fb96f1be65a82143aad59afc7569141f603e4362a1d44b0e"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.173803 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerStarted","Data":"53cd4a75ebfee1686f2db1e566581c31a9b03470b4313025f3a980087eb27a00"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.175969 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"1ce5c74b-2f06-4910-92b5-54abaa46ab8b","Type":"ContainerStarted","Data":"6aa52f928dfcf20043e1e8907fdf7b8ac7fc0fee297941db822b3adf39ff7e52"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.177585 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerStarted","Data":"5433561752fd3b8f83751ddd33926ccfe479acc64fdf830adcad528290d813de"} Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.209814 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podStartSLOduration=4.209797631 podStartE2EDuration="4.209797631s" podCreationTimestamp="2026-03-13 20:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:29.208556927 +0000 UTC m=+220.229672818" watchObservedRunningTime="2026-03-13 20:31:29.209797631 +0000 UTC m=+220.230913522" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.252207 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.291523 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-jw27w" podStartSLOduration=16.291502775 podStartE2EDuration="16.291502775s" podCreationTimestamp="2026-03-13 20:31:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:29.286803008 +0000 UTC m=+220.307918899" watchObservedRunningTime="2026-03-13 20:31:29.291502775 +0000 UTC m=+220.312618666" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.342550 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podStartSLOduration=4.342532019 podStartE2EDuration="4.342532019s" podCreationTimestamp="2026-03-13 20:31:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:29.341625404 +0000 UTC m=+220.362741295" watchObservedRunningTime="2026-03-13 20:31:29.342532019 +0000 UTC m=+220.363647910" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.456176 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.459707 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.722451 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.749245 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.758169 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-pzx4q" Mar 13 20:31:29 crc kubenswrapper[4790]: I0313 20:31:29.839096 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:31:29 crc kubenswrapper[4790]: W0313 20:31:29.841991 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81949470_5c0d_4294_8618_d6ee14da1d41.slice/crio-403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6 WatchSource:0}: Error finding container 403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6: Status 404 returned error can't find the container with id 403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6 Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.185506 4790 generic.go:334] "Generic (PLEG): container finished" podID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerID="baf477fae6f589d57e0060deeedd270c70f1976e7a6063c5a8ff425709bdf2d2" exitCode=0 Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.185590 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1ce5c74b-2f06-4910-92b5-54abaa46ab8b","Type":"ContainerDied","Data":"baf477fae6f589d57e0060deeedd270c70f1976e7a6063c5a8ff425709bdf2d2"} Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.186412 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerStarted","Data":"403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6"} Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.187712 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerID="f276b163ccc0d21403b49d02b3c506a94213d0bcc943d5fcede3603bc020ebfc" exitCode=0 Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.187984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"f276b163ccc0d21403b49d02b3c506a94213d0bcc943d5fcede3603bc020ebfc"} Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.191872 4790 generic.go:334] "Generic (PLEG): container finished" podID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerID="4ccfbd25425ce912c32c0f73aa49b376929e5a036b5718d87d565520eab1f4ab" exitCode=0 Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.192010 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"4ccfbd25425ce912c32c0f73aa49b376929e5a036b5718d87d565520eab1f4ab"} Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 
20:31:30.506091 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.693920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") pod \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.694028 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") pod \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.694122 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") pod \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\" (UID: \"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51\") " Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.694911 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume" (OuterVolumeSpecName: "config-volume") pod "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" (UID: "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.705166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7" (OuterVolumeSpecName: "kube-api-access-d66s7") pod "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" (UID: "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51"). InnerVolumeSpecName "kube-api-access-d66s7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.705216 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" (UID: "87e4f09f-d19e-4b0a-85b2-636b5ce5ef51"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.795426 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.795460 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:30 crc kubenswrapper[4790]: I0313 20:31:30.795470 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d66s7\" (UniqueName: \"kubernetes.io/projected/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51-kube-api-access-d66s7\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.206856 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" event={"ID":"87e4f09f-d19e-4b0a-85b2-636b5ce5ef51","Type":"ContainerDied","Data":"1809f43b88080170a440a364505c4febd360a062e9e4aabd772262f808d67b1c"} Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.206897 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.206900 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1809f43b88080170a440a364505c4febd360a062e9e4aabd772262f808d67b1c" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.210547 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerStarted","Data":"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb"} Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.210683 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.253793 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" podStartSLOduration=176.253776124 podStartE2EDuration="2m56.253776124s" podCreationTimestamp="2026-03-13 20:28:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:31:31.251026919 +0000 UTC m=+222.272142810" watchObservedRunningTime="2026-03-13 20:31:31.253776124 +0000 UTC m=+222.274892015" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.585088 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-zwfns" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.586596 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.711222 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") pod \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.711304 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") pod \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\" (UID: \"1ce5c74b-2f06-4910-92b5-54abaa46ab8b\") " Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.712146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1ce5c74b-2f06-4910-92b5-54abaa46ab8b" (UID: "1ce5c74b-2f06-4910-92b5-54abaa46ab8b"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.724427 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1ce5c74b-2f06-4910-92b5-54abaa46ab8b" (UID: "1ce5c74b-2f06-4910-92b5-54abaa46ab8b"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.812751 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:31 crc kubenswrapper[4790]: I0313 20:31:31.812788 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1ce5c74b-2f06-4910-92b5-54abaa46ab8b-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.218416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"1ce5c74b-2f06-4910-92b5-54abaa46ab8b","Type":"ContainerDied","Data":"6aa52f928dfcf20043e1e8907fdf7b8ac7fc0fee297941db822b3adf39ff7e52"} Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.218467 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6aa52f928dfcf20043e1e8907fdf7b8ac7fc0fee297941db822b3adf39ff7e52" Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.218441 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 13 20:31:32 crc kubenswrapper[4790]: I0313 20:31:32.709589 4790 ???:1] "http: TLS handshake error from 192.168.126.11:36548: no serving certificate available for the kubelet" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191406 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191425 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191465 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.191470 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.363251 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:36 crc kubenswrapper[4790]: I0313 20:31:36.374887 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.017565 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.017921 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.791064 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.791304 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" containerID="cri-o://c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618" gracePeriod=30 Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.810026 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] 
Mar 13 20:31:44 crc kubenswrapper[4790]: I0313 20:31:44.810276 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" containerID="cri-o://fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed" gracePeriod=30 Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.712492 4790 generic.go:334] "Generic (PLEG): container finished" podID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerID="c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618" exitCode=0 Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.712595 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerDied","Data":"c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618"} Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.714553 4790 generic.go:334] "Generic (PLEG): container finished" podID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerID="fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed" exitCode=0 Mar 13 20:31:45 crc kubenswrapper[4790]: I0313 20:31:45.714605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerDied","Data":"fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed"} Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191634 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191715 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191651 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.191820 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.192351 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:46 crc 
kubenswrapper[4790]: I0313 20:31:46.192510 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.193530 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac"} pod="openshift-console/downloads-7954f5f757-zfhhl" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 13 20:31:46 crc kubenswrapper[4790]: I0313 20:31:46.193582 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" containerID="cri-o://e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac" gracePeriod=2 Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.686681 4790 patch_prober.go:28] interesting pod/route-controller-manager-69fc968766-v5gfg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.687013 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.727732 4790 generic.go:334] "Generic (PLEG): container finished" podID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerID="e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac" exitCode=0 Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.727782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerDied","Data":"e9bbd363611d3d25a3b6940c0c5a363cbf07f241be6299b10534167899b2bdac"} Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.742534 4790 patch_prober.go:28] interesting pod/controller-manager-8687f458cd-h5svs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 13 20:31:47 crc kubenswrapper[4790]: I0313 20:31:47.742603 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 13 20:31:49 crc kubenswrapper[4790]: I0313 20:31:49.467369 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:31:53 crc kubenswrapper[4790]: I0313 20:31:53.220833 4790 ???:1] "http: TLS handshake error 
from 192.168.126.11:49802: no serving certificate available for the kubelet" Mar 13 20:31:56 crc kubenswrapper[4790]: I0313 20:31:56.190994 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:31:56 crc kubenswrapper[4790]: I0313 20:31:56.191331 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:31:56 crc kubenswrapper[4790]: I0313 20:31:56.390807 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-cszm6" Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.687288 4790 patch_prober.go:28] interesting pod/route-controller-manager-69fc968766-v5gfg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" start-of-body= Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.687640 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": dial tcp 10.217.0.52:8443: connect: connection refused" Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.742210 4790 patch_prober.go:28] interesting pod/controller-manager-8687f458cd-h5svs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" start-of-body= Mar 13 20:31:57 crc kubenswrapper[4790]: I0313 20:31:57.742265 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": dial tcp 10.217.0.53:8443: connect: connection refused" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471060 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 20:31:58 crc kubenswrapper[4790]: E0313 20:31:58.471283 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerName="pruner" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471294 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerName="pruner" Mar 13 20:31:58 crc kubenswrapper[4790]: E0313 20:31:58.471305 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerName="collect-profiles" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471310 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerName="collect-profiles" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 
20:31:58.471420 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" containerName="collect-profiles" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.471433 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ce5c74b-2f06-4910-92b5-54abaa46ab8b" containerName="pruner" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.472350 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.474705 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.475893 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.484077 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.514321 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.514446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.615451 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.615543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.615562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.634564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:31:58 crc kubenswrapper[4790]: I0313 20:31:58.794854 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.132737 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.134100 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.136214 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.140139 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.234739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"auto-csr-approver-29557232-bblq8\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.335975 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"auto-csr-approver-29557232-bblq8\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.354481 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"auto-csr-approver-29557232-bblq8\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:00 crc kubenswrapper[4790]: I0313 20:32:00.449613 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:01 crc kubenswrapper[4790]: I0313 20:32:01.918467 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.467440 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.468077 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.487751 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.565925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.566251 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.566479 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.667914 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.667981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.668006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.668028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.668067 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"installer-9-crc\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.688224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:02 crc kubenswrapper[4790]: I0313 20:32:02.795888 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:32:06 crc kubenswrapper[4790]: I0313 20:32:06.192212 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:06 crc kubenswrapper[4790]: I0313 20:32:06.192542 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.686789 4790 patch_prober.go:28] interesting pod/route-controller-manager-69fc968766-v5gfg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.687102 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.52:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.742743 4790 patch_prober.go:28] interesting pod/controller-manager-8687f458cd-h5svs container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:32:08 crc kubenswrapper[4790]: I0313 20:32:08.742843 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.53:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:32:12 crc kubenswrapper[4790]: E0313 20:32:12.153173 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage238052257/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 20:32:12 crc kubenswrapper[4790]: E0313 20:32:12.153692 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xbqp8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bq4pj_openshift-marketplace(e17d5bd1-f368-47a4-80cb-3bd3eb4b822c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage238052257/2\": happened during read: context canceled" logger="UnhandledError" Mar 13 20:32:12 crc kubenswrapper[4790]: E0313 20:32:12.154891 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \\\"/var/tmp/container_images_storage238052257/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bq4pj" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.016885 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.016943 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.016986 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.017961 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 
20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.018019 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400" gracePeriod=600 Mar 13 20:32:14 crc kubenswrapper[4790]: E0313 20:32:14.884338 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bq4pj" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.924548 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.929032 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954245 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954323 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954401 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954454 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954484 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") pod \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\" (UID: \"ff385bac-0b93-4dc8-b8bc-ef1b4986649b\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbn26\" (UniqueName: 
\"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.954609 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") pod \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\" (UID: \"04038bbe-4cc0-4d19-80d7-f86cdffda1d5\") " Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.955563 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca" (OuterVolumeSpecName: "client-ca") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.958831 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config" (OuterVolumeSpecName: "config") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.959854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca" (OuterVolumeSpecName: "client-ca") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.960288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.962565 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26" (OuterVolumeSpecName: "kube-api-access-cbn26") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "kube-api-access-cbn26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.962649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config" (OuterVolumeSpecName: "config") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.962884 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "04038bbe-4cc0-4d19-80d7-f86cdffda1d5" (UID: "04038bbe-4cc0-4d19-80d7-f86cdffda1d5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.963986 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:14 crc kubenswrapper[4790]: E0313 20:32:14.964261 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964273 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: E0313 20:32:14.964283 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964290 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964483 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.964936 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" containerName="route-controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.965032 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" containerName="controller-manager" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.966724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.967957 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5" (OuterVolumeSpecName: "kube-api-access-797h5") pod "ff385bac-0b93-4dc8-b8bc-ef1b4986649b" (UID: "ff385bac-0b93-4dc8-b8bc-ef1b4986649b"). InnerVolumeSpecName "kube-api-access-797h5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:14 crc kubenswrapper[4790]: I0313 20:32:14.976652 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055701 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055958 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055973 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055986 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.055995 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-797h5\" (UniqueName: \"kubernetes.io/projected/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-kube-api-access-797h5\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056004 
4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056012 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056021 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff385bac-0b93-4dc8-b8bc-ef1b4986649b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056029 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbn26\" (UniqueName: \"kubernetes.io/projected/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-kube-api-access-cbn26\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.056036 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/04038bbe-4cc0-4d19-80d7-f86cdffda1d5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.057632 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" event={"ID":"04038bbe-4cc0-4d19-80d7-f86cdffda1d5","Type":"ContainerDied","Data":"db98d9a2f6fdbbe8abd9aaa1bfbfc3ef07a5ef170be435e5a3e26c5a2b07958a"} Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.057665 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-8687f458cd-h5svs" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.057690 4790 scope.go:117] "RemoveContainer" containerID="c582f7273eb48a0199c8d7ed2bdfea28605189fac5c66a90356664d5f29d8618" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.060899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" event={"ID":"ff385bac-0b93-4dc8-b8bc-ef1b4986649b","Type":"ContainerDied","Data":"e64961300406621b1586951d2f5b3f6e49675a9edac316b75dde99552bd4b189"} Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.061002 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.064493 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400" exitCode=0 Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.064525 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400"} Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.090759 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.093299 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-8687f458cd-h5svs"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.101116 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.104281 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69fc968766-v5gfg"] Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156706 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156854 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.156885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " 
pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.157886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.158097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.158186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.172476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.180649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"controller-manager-578f7cc4b8-ngnwx\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.312703 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.666778 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04038bbe-4cc0-4d19-80d7-f86cdffda1d5" path="/var/lib/kubelet/pods/04038bbe-4cc0-4d19-80d7-f86cdffda1d5/volumes" Mar 13 20:32:15 crc kubenswrapper[4790]: I0313 20:32:15.667476 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff385bac-0b93-4dc8-b8bc-ef1b4986649b" path="/var/lib/kubelet/pods/ff385bac-0b93-4dc8-b8bc-ef1b4986649b/volumes" Mar 13 20:32:16 crc kubenswrapper[4790]: I0313 20:32:16.192164 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:16 crc kubenswrapper[4790]: I0313 20:32:16.192261 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.684631 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.685769 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.687965 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.687999 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.688143 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.688824 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.690262 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.691371 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.694071 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791254 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: 
I0313 20:32:17.791305 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.791405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892353 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.892510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.893485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.894022 
4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.901396 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:17 crc kubenswrapper[4790]: I0313 20:32:17.908867 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"route-controller-manager-67995dc89c-q5mcq\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:18 crc kubenswrapper[4790]: I0313 20:32:18.043015 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.105159 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\": context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.105957 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x5fzv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mf4tm_openshift-marketplace(f1be7d98-ff3a-42bb-b8ff-4001814ae453): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\": context canceled" logger="UnhandledError" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.107245 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e: Get \\\"https://registry.redhat.io/v2/redhat/redhat-marketplace-index/blobs/sha256:1cbc3fa0429b1e7ccd7344896a786f490a69cd57258c89894900d0f00ccac64e\\\": context canceled\"" pod="openshift-marketplace/redhat-marketplace-mf4tm" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.387983 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.388152 4790 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 20:32:20 crc kubenswrapper[4790]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 13 20:32:20 crc kubenswrapper[4790]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hb9zk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29557230-8pqh8_openshift-infra(d598b7c0-7c77-4903-9138-d8a3d01f9efe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 13 20:32:20 crc kubenswrapper[4790]: > logger="UnhandledError" Mar 13 20:32:20 crc kubenswrapper[4790]: E0313 20:32:20.389398 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" Mar 13 20:32:21 crc kubenswrapper[4790]: E0313 20:32:21.100596 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" Mar 13 20:32:21 crc kubenswrapper[4790]: E0313 20:32:21.649305 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mf4tm" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" Mar 13 20:32:23 crc kubenswrapper[4790]: E0313 20:32:23.259720 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 20:32:23 crc kubenswrapper[4790]: E0313 20:32:23.260438 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hskct,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-txx64_openshift-marketplace(7080e6b3-5934-4c2c-9361-23d20b5a495e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:23 crc kubenswrapper[4790]: E0313 20:32:23.261656 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-txx64" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" Mar 13 20:32:26 crc kubenswrapper[4790]: I0313 20:32:26.191282 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:26 crc kubenswrapper[4790]: I0313 20:32:26.191628 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.385540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-txx64" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.455674 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.455828 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vhj8f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-fxjp7_openshift-marketplace(4aa0c26b-aef8-49e9-9904-da9e8d029c9d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.457566 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.480312 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.480463 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fwk57,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-df8gv_openshift-marketplace(da03af74-8c59-4ccf-aff8-03dc6303e322): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:26 crc kubenswrapper[4790]: E0313 20:32:26.481761 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-df8gv" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.162420 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-df8gv" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.171072 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.211759 4790 scope.go:117] "RemoveContainer" containerID="fe3a1b339d4ab84389f1307367b2bebb3168e6d421f121726f4d936f123537ed" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.249740 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.250676 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 13 20:32:28 crc 
kubenswrapper[4790]: E0313 20:32:28.250908 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zhkbp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-672cv_openshift-marketplace(dbee8a79-e625-49ef-8fcb-944341ae6e37): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.252135 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-672cv" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.253333 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4dmtw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-5tr4n_openshift-marketplace(446f0f4c-a97c-47d0-929d-0b99e07c8186): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 13 20:32:28 crc kubenswrapper[4790]: E0313 20:32:28.255006 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-5tr4n" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.456223 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.508161 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.747810 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.861969 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:28 crc kubenswrapper[4790]: I0313 20:32:28.868177 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:28 crc kubenswrapper[4790]: W0313 20:32:28.883870 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25fd28fa_57e3_41b6_8329_693cbfb20e89.slice/crio-d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d WatchSource:0}: Error finding container d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d: Status 404 returned error can't find the container with id d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.147444 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerStarted","Data":"541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.147816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerStarted","Data":"d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.149655 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.155187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-zfhhl" event={"ID":"6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150","Type":"ContainerStarted","Data":"cdca48635a5083c3a3adb08d1d13d2bc3dcf5e76b82c79cc4754522c8cfa7f45"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.156139 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.159344 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.159462 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.160531 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.162149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerStarted","Data":"9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.162182 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerStarted","Data":"6227ca10032f20f7061333e184a7f5fd825e11a98a57054176baad69903a3e6d"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.163246 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.181048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.184656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerStarted","Data":"73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.186423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerStarted","Data":"9f6af6894778163383ed6bc7ed4bee995281a37fdd644ab6915400ceabaa99c9"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.186488 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerStarted","Data":"7c45eb7619c8e10226f3c5dac1f003594c20c32d5deea5cafda395f8e88d886e"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.192404 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" podStartSLOduration=26.192362314 podStartE2EDuration="26.192362314s" podCreationTimestamp="2026-03-13 20:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.173019839 +0000 UTC m=+280.194135730" watchObservedRunningTime="2026-03-13 20:32:29.192362314 +0000 UTC m=+280.213478205" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.200185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerStarted","Data":"a23eb85d97b1e4751bafccab0781c9447925014836e41ef0f17d54c7448721b2"} Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.215051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerStarted","Data":"efd1d06a6ce25e4e3fca34226ca853275cb494e1f7d417b592640cdbae34182e"} Mar 13 20:32:29 crc kubenswrapper[4790]: E0313 20:32:29.217858 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-672cv" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" Mar 13 20:32:29 crc kubenswrapper[4790]: E0313 20:32:29.218113 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-5tr4n" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.260592 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" podStartSLOduration=25.260571604 podStartE2EDuration="25.260571604s" podCreationTimestamp="2026-03-13 20:32:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.258297682 +0000 UTC m=+280.279413573" watchObservedRunningTime="2026-03-13 20:32:29.260571604 +0000 UTC m=+280.281687495" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.330614 4790 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=27.330591834 podStartE2EDuration="27.330591834s" podCreationTimestamp="2026-03-13 20:32:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.327641714 +0000 UTC m=+280.348757615" watchObservedRunningTime="2026-03-13 20:32:29.330591834 +0000 UTC m=+280.351707725" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.385070 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=31.385046222 podStartE2EDuration="31.385046222s" podCreationTimestamp="2026-03-13 20:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:29.382820711 +0000 UTC m=+280.403936602" watchObservedRunningTime="2026-03-13 20:32:29.385046222 +0000 UTC m=+280.406162113" Mar 13 20:32:29 crc kubenswrapper[4790]: I0313 20:32:29.549061 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.225018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerStarted","Data":"7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.227347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerStarted","Data":"58f159651637d3217394d3f34d5549bae6158dd0fd270cdccfb0e48c45bc1c2d"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.229493 4790 generic.go:334] "Generic (PLEG): container finished" podID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerID="73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6" exitCode=0 Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.229572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.230730 4790 generic.go:334] "Generic (PLEG): container finished" podID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerID="9f6af6894778163383ed6bc7ed4bee995281a37fdd644ab6915400ceabaa99c9" exitCode=0 Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.231244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerDied","Data":"9f6af6894778163383ed6bc7ed4bee995281a37fdd644ab6915400ceabaa99c9"} Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.232908 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.232946 4790 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.581527 4790 csr.go:261] certificate signing request csr-zhfvx is approved, waiting to be issued Mar 13 20:32:30 crc kubenswrapper[4790]: I0313 20:32:30.589508 4790 csr.go:257] certificate signing request csr-zhfvx is issued Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.241707 4790 generic.go:334] "Generic (PLEG): container finished" podID="b190462f-7836-44f0-94c0-1311bdf8e550" containerID="7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d" exitCode=0 Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.241775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerDied","Data":"7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d"} Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.243587 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.243641 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.568273 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.587974 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") pod \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.588073 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") pod \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\" (UID: \"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c\") " Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.589109 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" (UID: "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.594509 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 21:43:25.708006275 +0000 UTC Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.594775 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6481h10m54.113234091s for next certificate rotation Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.595771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" (UID: "6c64aba6-6db6-4d23-91f9-9ba5f7b2373c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.689618 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:31 crc kubenswrapper[4790]: I0313 20:32:31.689868 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6c64aba6-6db6-4d23-91f9-9ba5f7b2373c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.248449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"6c64aba6-6db6-4d23-91f9-9ba5f7b2373c","Type":"ContainerDied","Data":"7c45eb7619c8e10226f3c5dac1f003594c20c32d5deea5cafda395f8e88d886e"} Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.248501 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c45eb7619c8e10226f3c5dac1f003594c20c32d5deea5cafda395f8e88d886e" Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.248464 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.595705 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-08 22:34:01.937259568 +0000 UTC Mar 13 20:32:32 crc kubenswrapper[4790]: I0313 20:32:32.596040 4790 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6482h1m29.341225409s for next certificate rotation Mar 13 20:32:33 crc kubenswrapper[4790]: I0313 20:32:33.917811 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.034225 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") pod \"b190462f-7836-44f0-94c0-1311bdf8e550\" (UID: \"b190462f-7836-44f0-94c0-1311bdf8e550\") " Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.041454 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz" (OuterVolumeSpecName: "kube-api-access-jpxhz") pod "b190462f-7836-44f0-94c0-1311bdf8e550" (UID: "b190462f-7836-44f0-94c0-1311bdf8e550"). InnerVolumeSpecName "kube-api-access-jpxhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.135548 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpxhz\" (UniqueName: \"kubernetes.io/projected/b190462f-7836-44f0-94c0-1311bdf8e550-kube-api-access-jpxhz\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.264090 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557232-bblq8" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.264712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557232-bblq8" event={"ID":"b190462f-7836-44f0-94c0-1311bdf8e550","Type":"ContainerDied","Data":"a23eb85d97b1e4751bafccab0781c9447925014836e41ef0f17d54c7448721b2"} Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.264767 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a23eb85d97b1e4751bafccab0781c9447925014836e41ef0f17d54c7448721b2" Mar 13 20:32:34 crc kubenswrapper[4790]: I0313 20:32:34.266958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerStarted","Data":"85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af"} Mar 13 20:32:35 crc kubenswrapper[4790]: I0313 20:32:35.296485 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-hnd2l" podStartSLOduration=4.6210440550000005 podStartE2EDuration="1m8.296467045s" podCreationTimestamp="2026-03-13 20:31:27 +0000 UTC" firstStartedPulling="2026-03-13 20:31:30.194206684 +0000 UTC m=+221.215322575" lastFinishedPulling="2026-03-13 20:32:33.869629674 +0000 UTC m=+284.890745565" observedRunningTime="2026-03-13 20:32:35.294767989 +0000 UTC m=+286.315883880" watchObservedRunningTime="2026-03-13 20:32:35.296467045 +0000 UTC m=+286.317582936" Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.190964 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.191449 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 
10.217.0.20:8080: connect: connection refused" Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.190995 4790 patch_prober.go:28] interesting pod/downloads-7954f5f757-zfhhl container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.191566 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-zfhhl" podUID="6abee7d9-6de2-4bc0-8a59-b3e2b6fd3150" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 13 20:32:36 crc kubenswrapper[4790]: I0313 20:32:36.284748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerStarted","Data":"fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206"} Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.299011 4790 generic.go:334] "Generic (PLEG): container finished" podID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerID="8f1a4232fe3ee20e22f3a57d7811b303dba4631c6cf2890a09449767842fc5b4" exitCode=0 Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.299332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" event={"ID":"d598b7c0-7c77-4903-9138-d8a3d01f9efe","Type":"ContainerDied","Data":"8f1a4232fe3ee20e22f3a57d7811b303dba4631c6cf2890a09449767842fc5b4"} Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.301021 4790 generic.go:334] "Generic (PLEG): container finished" podID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerID="fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206" exitCode=0 Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.301067 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206"} Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.304055 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerID="912e4880b7a24b954e780b6b21a914866c6b2e2fd8684cf3dc798b5f59ce287f" exitCode=0 Mar 13 20:32:37 crc kubenswrapper[4790]: I0313 20:32:37.304085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"912e4880b7a24b954e780b6b21a914866c6b2e2fd8684cf3dc798b5f59ce287f"} Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.075686 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.075810 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.779990 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.928276 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") pod \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\" (UID: \"d598b7c0-7c77-4903-9138-d8a3d01f9efe\") " Mar 13 20:32:38 crc kubenswrapper[4790]: I0313 20:32:38.934455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk" (OuterVolumeSpecName: "kube-api-access-hb9zk") pod "d598b7c0-7c77-4903-9138-d8a3d01f9efe" (UID: "d598b7c0-7c77-4903-9138-d8a3d01f9efe"). InnerVolumeSpecName "kube-api-access-hb9zk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.037396 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hb9zk\" (UniqueName: \"kubernetes.io/projected/d598b7c0-7c77-4903-9138-d8a3d01f9efe-kube-api-access-hb9zk\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.320559 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" event={"ID":"d598b7c0-7c77-4903-9138-d8a3d01f9efe","Type":"ContainerDied","Data":"3851738f410766329c5133a13a2bdd38c600122354cde8d6b4c645c3b69815b7"} Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.320878 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3851738f410766329c5133a13a2bdd38c600122354cde8d6b4c645c3b69815b7" Mar 13 20:32:39 crc kubenswrapper[4790]: I0313 20:32:39.320626 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557230-8pqh8" Mar 13 20:32:40 crc kubenswrapper[4790]: I0313 20:32:40.134861 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-hnd2l" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" probeResult="failure" output=< Mar 13 20:32:40 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:32:40 crc kubenswrapper[4790]: > Mar 13 20:32:43 crc kubenswrapper[4790]: I0313 20:32:43.966034 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:43 crc kubenswrapper[4790]: I0313 20:32:43.966804 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" containerID="cri-o://541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9" gracePeriod=30 Mar 13 20:32:44 crc kubenswrapper[4790]: I0313 20:32:44.181195 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:44 crc kubenswrapper[4790]: I0313 20:32:44.181476 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" containerID="cri-o://9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153" gracePeriod=30 Mar 13 20:32:44 crc kubenswrapper[4790]: I0313 20:32:44.346986 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerStarted","Data":"674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee"} Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.314983 4790 patch_prober.go:28] interesting pod/controller-manager-578f7cc4b8-ngnwx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.315956 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.352661 4790 generic.go:334] "Generic (PLEG): container finished" podID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerID="541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9" exitCode=0 Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.352713 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerDied","Data":"541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9"} Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.354369 4790 generic.go:334] "Generic (PLEG): container finished" podID="506dcf0c-8c65-486f-ac8d-e16ba9474095" 
containerID="9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153" exitCode=0 Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.354638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerDied","Data":"9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153"} Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.373288 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bq4pj" podStartSLOduration=5.39208067 podStartE2EDuration="1m19.373269219s" podCreationTimestamp="2026-03-13 20:31:26 +0000 UTC" firstStartedPulling="2026-03-13 20:31:29.147130772 +0000 UTC m=+220.168246663" lastFinishedPulling="2026-03-13 20:32:43.128319321 +0000 UTC m=+294.149435212" observedRunningTime="2026-03-13 20:32:45.370286198 +0000 UTC m=+296.391402089" watchObservedRunningTime="2026-03-13 20:32:45.373269219 +0000 UTC m=+296.394385110" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.925339 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947577 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerName="pruner" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947588 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerName="pruner" Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947604 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947610 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947622 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947628 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: E0313 20:32:45.947637 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947643 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" containerName="controller-manager" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947727 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947735 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" containerName="oc" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947745 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" 
containerName="controller-manager" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.947752 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c64aba6-6db6-4d23-91f9-9ba5f7b2373c" containerName="pruner" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.948087 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.968153 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:32:45 crc kubenswrapper[4790]: I0313 20:32:45.992456 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.079447 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080682 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080749 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080772 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") pod \"506dcf0c-8c65-486f-ac8d-e16ba9474095\" (UID: \"506dcf0c-8c65-486f-ac8d-e16ba9474095\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080824 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080858 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.080925 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") pod \"25fd28fa-57e3-41b6-8329-693cbfb20e89\" (UID: \"25fd28fa-57e3-41b6-8329-693cbfb20e89\") " Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081159 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081228 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081436 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config" (OuterVolumeSpecName: "config") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.081484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca" (OuterVolumeSpecName: "client-ca") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.082035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca" (OuterVolumeSpecName: "client-ca") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.082057 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.082334 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config" (OuterVolumeSpecName: "config") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf" (OuterVolumeSpecName: "kube-api-access-v86xf") pod "506dcf0c-8c65-486f-ac8d-e16ba9474095" (UID: "506dcf0c-8c65-486f-ac8d-e16ba9474095"). InnerVolumeSpecName "kube-api-access-v86xf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085329 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc" (OuterVolumeSpecName: "kube-api-access-kvnkc") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "kube-api-access-kvnkc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.085616 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "25fd28fa-57e3-41b6-8329-693cbfb20e89" (UID: "25fd28fa-57e3-41b6-8329-693cbfb20e89"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182044 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182243 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182304 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182319 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25fd28fa-57e3-41b6-8329-693cbfb20e89-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182332 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvnkc\" (UniqueName: \"kubernetes.io/projected/25fd28fa-57e3-41b6-8329-693cbfb20e89-kube-api-access-kvnkc\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182344 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/506dcf0c-8c65-486f-ac8d-e16ba9474095-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182356 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182367 4790 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182405 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/25fd28fa-57e3-41b6-8329-693cbfb20e89-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182423 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v86xf\" (UniqueName: \"kubernetes.io/projected/506dcf0c-8c65-486f-ac8d-e16ba9474095-kube-api-access-v86xf\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.182436 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/506dcf0c-8c65-486f-ac8d-e16ba9474095-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.184063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.184237 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.184496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.190080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.198287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod \"controller-manager-86f474f899-ksxbf\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.209777 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-zfhhl" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.291399 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.363086 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.363082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx" event={"ID":"25fd28fa-57e3-41b6-8329-693cbfb20e89","Type":"ContainerDied","Data":"d16751fe8d93ef90d78c72ad7048c33f0e61e8d759f5ea723ab7c391af56413d"} Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.363227 4790 scope.go:117] "RemoveContainer" containerID="541380d65713151c215e4663c0f030bcf539002fb9c48968a355c573726423c9" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.365038 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.364826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq" event={"ID":"506dcf0c-8c65-486f-ac8d-e16ba9474095","Type":"ContainerDied","Data":"6227ca10032f20f7061333e184a7f5fd825e11a98a57054176baad69903a3e6d"} Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.403985 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.408804 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-578f7cc4b8-ngnwx"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.409805 4790 scope.go:117] "RemoveContainer" containerID="9be66b8654608963a58b8257f6d910c9aaebfd7bf0c22829f9fb2eab27a9a153" Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.425599 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.429842 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-67995dc89c-q5mcq"] Mar 13 20:32:46 crc kubenswrapper[4790]: I0313 20:32:46.700570 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:32:46 crc kubenswrapper[4790]: W0313 20:32:46.709792 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677b8903_b2f7_437f_a96d_f72d1ed30de5.slice/crio-e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b WatchSource:0}: Error finding container e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b: Status 404 returned error can't find the container with id e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.081285 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.081338 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.372348 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.376422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerStarted","Data":"65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692"} Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.379972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerStarted","Data":"e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b"} Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.667915 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25fd28fa-57e3-41b6-8329-693cbfb20e89" path="/var/lib/kubelet/pods/25fd28fa-57e3-41b6-8329-693cbfb20e89/volumes" Mar 13 20:32:47 crc kubenswrapper[4790]: I0313 20:32:47.668567 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" path="/var/lib/kubelet/pods/506dcf0c-8c65-486f-ac8d-e16ba9474095/volumes" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.155491 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.197946 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.386149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerStarted","Data":"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6"} Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.386816 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.393954 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.409754 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mf4tm" podStartSLOduration=4.663511714 podStartE2EDuration="1m21.409737412s" podCreationTimestamp="2026-03-13 20:31:27 +0000 UTC" firstStartedPulling="2026-03-13 20:31:29.174766952 +0000 UTC m=+220.195882843" lastFinishedPulling="2026-03-13 20:32:45.92099265 +0000 UTC m=+296.942108541" observedRunningTime="2026-03-13 20:32:48.403742879 +0000 UTC m=+299.424858780" watchObservedRunningTime="2026-03-13 20:32:48.409737412 +0000 UTC m=+299.430853293" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.420837 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" podStartSLOduration=5.420821003 podStartE2EDuration="5.420821003s" podCreationTimestamp="2026-03-13 20:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:48.419154947 +0000 UTC 
m=+299.440270838" watchObservedRunningTime="2026-03-13 20:32:48.420821003 +0000 UTC m=+299.441936884" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.703682 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:32:48 crc kubenswrapper[4790]: E0313 20:32:48.703965 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.703982 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.704101 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="506dcf0c-8c65-486f-ac8d-e16ba9474095" containerName="route-controller-manager" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.704603 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.706631 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.706645 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.708683 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.708887 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.718355 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.720598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.721848 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.815356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.815852 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.815980 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.816167 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917368 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917493 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.917514 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.919240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.919445 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.933923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:48 crc kubenswrapper[4790]: I0313 20:32:48.941123 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"route-controller-manager-5bdf957567-5g6rp\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:49 crc kubenswrapper[4790]: I0313 20:32:49.024340 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.072565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:32:54 crc kubenswrapper[4790]: W0313 20:32:54.080795 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaa9c85a_2ba8_49ea_804e_f3b63b511642.slice/crio-0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd WatchSource:0}: Error finding container 0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd: Status 404 returned error can't find the container with id 0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.422667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerStarted","Data":"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.424434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerStarted","Data":"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.426310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerStarted","Data":"58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.428002 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerStarted","Data":"69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.430785 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerStarted","Data":"324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.432361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" 
event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerStarted","Data":"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.432526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerStarted","Data":"0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd"} Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.433122 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.502059 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" podStartSLOduration=10.502038424 podStartE2EDuration="10.502038424s" podCreationTimestamp="2026-03-13 20:32:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:32:54.497360086 +0000 UTC m=+305.518475987" watchObservedRunningTime="2026-03-13 20:32:54.502038424 +0000 UTC m=+305.523154315" Mar 13 20:32:54 crc kubenswrapper[4790]: I0313 20:32:54.876940 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.441047 4790 generic.go:334] "Generic (PLEG): container finished" podID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.441124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.445068 4790 generic.go:334] "Generic (PLEG): container finished" podID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerID="58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.445186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.450647 4790 generic.go:334] "Generic (PLEG): container finished" podID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerID="69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.450728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.454404 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerID="324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573" exitCode=0 Mar 13 20:32:55 crc 
kubenswrapper[4790]: I0313 20:32:55.454467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573"} Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.461919 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" exitCode=0 Mar 13 20:32:55 crc kubenswrapper[4790]: I0313 20:32:55.461995 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.124983 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.474519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerStarted","Data":"2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.476282 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerStarted","Data":"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.478507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerStarted","Data":"283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.480867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerStarted","Data":"934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a"} Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.483267 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.483289 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.495213 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fxjp7" podStartSLOduration=2.543039907 podStartE2EDuration="1m29.49518795s" podCreationTimestamp="2026-03-13 20:31:28 +0000 UTC" firstStartedPulling="2026-03-13 20:31:30.188874649 +0000 UTC m=+221.209990540" lastFinishedPulling="2026-03-13 20:32:57.141022692 +0000 UTC m=+308.162138583" observedRunningTime="2026-03-13 20:32:57.489830585 +0000 UTC m=+308.510946486" watchObservedRunningTime="2026-03-13 20:32:57.49518795 +0000 UTC m=+308.516303861" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.514537 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5tr4n" 
podStartSLOduration=3.561432804 podStartE2EDuration="1m33.514519445s" podCreationTimestamp="2026-03-13 20:31:24 +0000 UTC" firstStartedPulling="2026-03-13 20:31:27.053824332 +0000 UTC m=+218.074940223" lastFinishedPulling="2026-03-13 20:32:57.006910973 +0000 UTC m=+308.028026864" observedRunningTime="2026-03-13 20:32:57.511553875 +0000 UTC m=+308.532669766" watchObservedRunningTime="2026-03-13 20:32:57.514519445 +0000 UTC m=+308.535635346" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.520995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.535556 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-df8gv" podStartSLOduration=2.446325159 podStartE2EDuration="1m32.535540006s" podCreationTimestamp="2026-03-13 20:31:25 +0000 UTC" firstStartedPulling="2026-03-13 20:31:27.103042345 +0000 UTC m=+218.124158246" lastFinishedPulling="2026-03-13 20:32:57.192257202 +0000 UTC m=+308.213373093" observedRunningTime="2026-03-13 20:32:57.53016701 +0000 UTC m=+308.551282901" watchObservedRunningTime="2026-03-13 20:32:57.535540006 +0000 UTC m=+308.556655907" Mar 13 20:32:57 crc kubenswrapper[4790]: I0313 20:32:57.547985 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-672cv" podStartSLOduration=2.650158367 podStartE2EDuration="1m33.547971013s" podCreationTimestamp="2026-03-13 20:31:24 +0000 UTC" firstStartedPulling="2026-03-13 20:31:26.000526991 +0000 UTC m=+217.021642882" lastFinishedPulling="2026-03-13 20:32:56.898339627 +0000 UTC m=+307.919455528" observedRunningTime="2026-03-13 20:32:57.546610005 +0000 UTC m=+308.567725896" watchObservedRunningTime="2026-03-13 20:32:57.547971013 +0000 UTC m=+308.569086904" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.489141 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerStarted","Data":"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723"} Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.516270 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-txx64" podStartSLOduration=4.210761461 podStartE2EDuration="1m34.516252264s" podCreationTimestamp="2026-03-13 20:31:24 +0000 UTC" firstStartedPulling="2026-03-13 20:31:27.048412045 +0000 UTC m=+218.069527936" lastFinishedPulling="2026-03-13 20:32:57.353902848 +0000 UTC m=+308.375018739" observedRunningTime="2026-03-13 20:32:58.513409137 +0000 UTC m=+309.534525058" watchObservedRunningTime="2026-03-13 20:32:58.516252264 +0000 UTC m=+309.537368165" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.529611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.558229 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:32:58 crc kubenswrapper[4790]: I0313 20:32:58.558295 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:32:59 crc kubenswrapper[4790]: I0313 20:32:59.594400 4790 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" probeResult="failure" output=< Mar 13 20:32:59 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:32:59 crc kubenswrapper[4790]: > Mar 13 20:33:00 crc kubenswrapper[4790]: I0313 20:33:00.858330 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:33:00 crc kubenswrapper[4790]: I0313 20:33:00.858573 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mf4tm" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server" containerID="cri-o://65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692" gracePeriod=2 Mar 13 20:33:01 crc kubenswrapper[4790]: E0313 20:33:01.982660 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1be7d98_ff3a_42bb_b8ff_4001814ae453.slice/crio-conmon-65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692.scope\": RecentStats: unable to find data in memory cache]" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.516874 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerID="65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692" exitCode=0 Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.516915 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692"} Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.655931 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.756493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") pod \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.756667 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") pod \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.756709 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") pod \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\" (UID: \"f1be7d98-ff3a-42bb-b8ff-4001814ae453\") " Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.757611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities" (OuterVolumeSpecName: "utilities") pod "f1be7d98-ff3a-42bb-b8ff-4001814ae453" (UID: "f1be7d98-ff3a-42bb-b8ff-4001814ae453"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.762230 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv" (OuterVolumeSpecName: "kube-api-access-x5fzv") pod "f1be7d98-ff3a-42bb-b8ff-4001814ae453" (UID: "f1be7d98-ff3a-42bb-b8ff-4001814ae453"). InnerVolumeSpecName "kube-api-access-x5fzv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.781542 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f1be7d98-ff3a-42bb-b8ff-4001814ae453" (UID: "f1be7d98-ff3a-42bb-b8ff-4001814ae453"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.859008 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.859047 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f1be7d98-ff3a-42bb-b8ff-4001814ae453-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:02 crc kubenswrapper[4790]: I0313 20:33:02.859060 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x5fzv\" (UniqueName: \"kubernetes.io/projected/f1be7d98-ff3a-42bb-b8ff-4001814ae453-kube-api-access-x5fzv\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.524352 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mf4tm" event={"ID":"f1be7d98-ff3a-42bb-b8ff-4001814ae453","Type":"ContainerDied","Data":"53cd4a75ebfee1686f2db1e566581c31a9b03470b4313025f3a980087eb27a00"} Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.524437 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mf4tm" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.526480 4790 scope.go:117] "RemoveContainer" containerID="65470ced9c79cf36f4934c87e2e1721bdd054f66e3e7ccb08a55f44636a86692" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.544555 4790 scope.go:117] "RemoveContainer" containerID="912e4880b7a24b954e780b6b21a914866c6b2e2fd8684cf3dc798b5f59ce287f" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.551542 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.554260 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mf4tm"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.583650 4790 scope.go:117] "RemoveContainer" containerID="afed47472efd96d5fb96f1be65a82143aad59afc7569141f603e4362a1d44b0e" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.667834 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" path="/var/lib/kubelet/pods/f1be7d98-ff3a-42bb-b8ff-4001814ae453/volumes" Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.933889 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.934137 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager" containerID="cri-o://02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" gracePeriod=30 Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.966892 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:33:03 crc kubenswrapper[4790]: I0313 20:33:03.967226 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager" containerID="cri-o://ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" gracePeriod=30 Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.520986 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.527347 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.531926 4790 generic.go:334] "Generic (PLEG): container finished" podID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" exitCode=0 Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.531965 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.532011 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerDied","Data":"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.532043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f474f899-ksxbf" event={"ID":"677b8903-b2f7-437f-a96d-f72d1ed30de5","Type":"ContainerDied","Data":"e049c1b35eaf105e3a48a1ffc015b42b546f94fe2407bfc286c1b405c7dd9b1b"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.532064 4790 scope.go:117] "RemoveContainer" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.539932 4790 generic.go:334] "Generic (PLEG): container finished" podID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" exitCode=0 Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.539987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerDied","Data":"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.540006 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.540019 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp" event={"ID":"caa9c85a-2ba8-49ea-804e-f3b63b511642","Type":"ContainerDied","Data":"0e94017dcb5c8bf9612bfd88c97145cf3cacc14ab749d3522375343566af18fd"} Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.550206 4790 scope.go:117] "RemoveContainer" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" Mar 13 20:33:04 crc kubenswrapper[4790]: E0313 20:33:04.550701 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6\": container with ID starting with 02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6 not found: ID does not exist" containerID="02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.550746 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6"} err="failed to get container status \"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6\": rpc error: code = NotFound desc = could not find container \"02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6\": container with ID starting with 02dfd36d89af6eaef4ca5fc3bc43aef3e147cf4690a2c85117e1bb5c3f97aaa6 not found: ID does not exist" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.550775 4790 scope.go:117] "RemoveContainer" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" Mar 
13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.575187 4790 scope.go:117] "RemoveContainer" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" Mar 13 20:33:04 crc kubenswrapper[4790]: E0313 20:33:04.576253 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c\": container with ID starting with ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c not found: ID does not exist" containerID="ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.576323 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c"} err="failed to get container status \"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c\": rpc error: code = NotFound desc = could not find container \"ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c\": container with ID starting with ab603346896b85e9522d565f6f3eaf70aac8bfd3b764995fb50fe1f98eddd66c not found: ID does not exist" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687265 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687341 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687423 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687451 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687520 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687545 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") pod \"677b8903-b2f7-437f-a96d-f72d1ed30de5\" (UID: \"677b8903-b2f7-437f-a96d-f72d1ed30de5\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.687729 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") pod \"caa9c85a-2ba8-49ea-804e-f3b63b511642\" (UID: \"caa9c85a-2ba8-49ea-804e-f3b63b511642\") " Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688188 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688526 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca" (OuterVolumeSpecName: "client-ca") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688728 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config" (OuterVolumeSpecName: "config") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.688911 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca" (OuterVolumeSpecName: "client-ca") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.689183 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config" (OuterVolumeSpecName: "config") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.692855 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr" (OuterVolumeSpecName: "kube-api-access-9z5jr") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "kube-api-access-9z5jr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.692894 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r" (OuterVolumeSpecName: "kube-api-access-gh62r") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "kube-api-access-gh62r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.692953 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "677b8903-b2f7-437f-a96d-f72d1ed30de5" (UID: "677b8903-b2f7-437f-a96d-f72d1ed30de5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.693600 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "caa9c85a-2ba8-49ea-804e-f3b63b511642" (UID: "caa9c85a-2ba8-49ea-804e-f3b63b511642"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788907 4790 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788946 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh62r\" (UniqueName: \"kubernetes.io/projected/677b8903-b2f7-437f-a96d-f72d1ed30de5-kube-api-access-gh62r\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788959 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788969 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/677b8903-b2f7-437f-a96d-f72d1ed30de5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z5jr\" (UniqueName: \"kubernetes.io/projected/caa9c85a-2ba8-49ea-804e-f3b63b511642-kube-api-access-9z5jr\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788987 4790 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/caa9c85a-2ba8-49ea-804e-f3b63b511642-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.788995 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/677b8903-b2f7-437f-a96d-f72d1ed30de5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.789003 4790 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-client-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 
20:33:04.789011 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/caa9c85a-2ba8-49ea-804e-f3b63b511642-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.854822 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.854867 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.863981 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.869200 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86f474f899-ksxbf"] Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.874442 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.877005 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdf957567-5g6rp"] Mar 13 20:33:04 crc kubenswrapper[4790]: I0313 20:33:04.894992 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.201318 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.201620 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.246195 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.260268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.261565 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.309827 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.499914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.500745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.547317 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.592659 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.600583 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.609944 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.616084 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.667356 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" path="/var/lib/kubelet/pods/677b8903-b2f7-437f-a96d-f72d1ed30de5/volumes" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.668053 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" path="/var/lib/kubelet/pods/caa9c85a-2ba8-49ea-804e-f3b63b511642/volumes" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.710742 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"] Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711063 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711089 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager" Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711120 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-utilities" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711133 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-utilities" Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711148 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711161 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server" Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711182 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-content" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711193 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="extract-content" Mar 13 20:33:05 crc kubenswrapper[4790]: E0313 20:33:05.711218 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711230 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711368 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="caa9c85a-2ba8-49ea-804e-f3b63b511642" containerName="route-controller-manager" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711411 4790 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f1be7d98-ff3a-42bb-b8ff-4001814ae453" containerName="registry-server" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.711424 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="677b8903-b2f7-437f-a96d-f72d1ed30de5" containerName="controller-manager" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.712010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.713789 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6cdc6994c6-85s67"] Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.714496 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715128 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715607 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715791 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.715940 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.716046 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.716159 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.717280 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.717558 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.718594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"] Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.718834 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.719030 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.722012 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.722142 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.724359 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 
20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.726210 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdc6994c6-85s67"] Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.817296 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl"] Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.902560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902a53b3-c223-40ae-9dd9-47830295158c-serving-cert\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.902840 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-config\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.902934 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588cbe72-1cb6-4464-bba0-142104029595-serving-cert\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-config\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-client-ca\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-client-ca\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903464 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c6rg\" (UniqueName: \"kubernetes.io/projected/902a53b3-c223-40ae-9dd9-47830295158c-kube-api-access-6c6rg\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903614 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkrxm\" (UniqueName: \"kubernetes.io/projected/588cbe72-1cb6-4464-bba0-142104029595-kube-api-access-qkrxm\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:05 crc kubenswrapper[4790]: I0313 20:33:05.903725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-proxy-ca-bundles\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004401 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588cbe72-1cb6-4464-bba0-142104029595-serving-cert\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004450 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-config\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004466 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-client-ca\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004490 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-client-ca\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c6rg\" (UniqueName: \"kubernetes.io/projected/902a53b3-c223-40ae-9dd9-47830295158c-kube-api-access-6c6rg\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004522 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkrxm\" (UniqueName: \"kubernetes.io/projected/588cbe72-1cb6-4464-bba0-142104029595-kube-api-access-qkrxm\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004539 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-proxy-ca-bundles\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004567 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902a53b3-c223-40ae-9dd9-47830295158c-serving-cert\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.004615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-config\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.005849 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-client-ca\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.005921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-config\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.006284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-config\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.006671 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/588cbe72-1cb6-4464-bba0-142104029595-client-ca\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.006981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/902a53b3-c223-40ae-9dd9-47830295158c-proxy-ca-bundles\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.011542 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/902a53b3-c223-40ae-9dd9-47830295158c-serving-cert\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " 
pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.013644 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/588cbe72-1cb6-4464-bba0-142104029595-serving-cert\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.023184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkrxm\" (UniqueName: \"kubernetes.io/projected/588cbe72-1cb6-4464-bba0-142104029595-kube-api-access-qkrxm\") pod \"route-controller-manager-d97755bf4-2ssx6\" (UID: \"588cbe72-1cb6-4464-bba0-142104029595\") " pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.023670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c6rg\" (UniqueName: \"kubernetes.io/projected/902a53b3-c223-40ae-9dd9-47830295158c-kube-api-access-6c6rg\") pod \"controller-manager-6cdc6994c6-85s67\" (UID: \"902a53b3-c223-40ae-9dd9-47830295158c\") " pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.046682 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.057682 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.285280 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6"] Mar 13 20:33:06 crc kubenswrapper[4790]: W0313 20:33:06.295058 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588cbe72_1cb6_4464_bba0_142104029595.slice/crio-be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24 WatchSource:0}: Error finding container be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24: Status 404 returned error can't find the container with id be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.556812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerStarted","Data":"be6e5104d5f024e382f25f4e0b7ba0a57c041ccf5eec351b6ba16a2f016aee24"} Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.578032 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6cdc6994c6-85s67"] Mar 13 20:33:06 crc kubenswrapper[4790]: W0313 20:33:06.590715 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod902a53b3_c223_40ae_9dd9_47830295158c.slice/crio-fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3 WatchSource:0}: Error finding container fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3: Status 404 returned error can't find the container with id 
fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.804427 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.806474 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815482 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815523 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815575 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.815703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.857436 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.886499 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.886891 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" gracePeriod=15 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.886984 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" 
containerID="cri-o://5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" gracePeriod=15 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.887025 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" gracePeriod=15 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.887150 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" gracePeriod=15 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.887048 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" gracePeriod=15 Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892090 4790 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892638 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892680 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892701 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892717 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892742 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892759 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892778 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892793 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892817 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892837 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 20:33:06 crc 
kubenswrapper[4790]: E0313 20:33:06.892877 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892893 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892916 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892931 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.892951 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.892967 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893203 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893231 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893249 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893371 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893430 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893644 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893674 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.893699 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.894022 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.894047 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.894075 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 
20:33:06.894092 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.894336 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916211 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916320 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916653 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916697 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 
crc kubenswrapper[4790]: I0313 20:33:06.916799 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: I0313 20:33:06.916994 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:06 crc kubenswrapper[4790]: E0313 20:33:06.970425 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-6cdc6994c6-85s67.189c80d0a9eab648 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6cdc6994c6-85s67,UID:902a53b3-c223-40ae-9dd9-47830295158c,APIVersion:v1,ResourceVersion:29927,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC m=+317.990533189,LastTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC m=+317.990533189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.017514 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.017563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.017588 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118549 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118587 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.118659 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.145403 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:33:07 crc kubenswrapper[4790]: W0313 20:33:07.164234 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133 WatchSource:0}: Error finding container 1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133: Status 404 returned error can't find the container with id 1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.563668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1150ef12fc3ec87620646e26a22e083434d29929c16c8f712df6762ab17cf133"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.565286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerStarted","Data":"4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.565757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.567398 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.567910 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.568297 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.568874 4790 generic.go:334] "Generic (PLEG): container finished" podID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerID="58f159651637d3217394d3f34d5549bae6158dd0fd270cdccfb0e48c45bc1c2d" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.568959 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerDied","Data":"58f159651637d3217394d3f34d5549bae6158dd0fd270cdccfb0e48c45bc1c2d"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.569615 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" 
pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.569994 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.570461 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.571148 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.572035 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.573620 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574710 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574730 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574738 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" exitCode=0 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574745 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" exitCode=2 Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.574769 4790 scope.go:117] "RemoveContainer" containerID="39afc2dd3224fcb449078c926b3610f53bd13c92b3b86ee5ccf66fe731d78fab" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.576502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" event={"ID":"902a53b3-c223-40ae-9dd9-47830295158c","Type":"ContainerStarted","Data":"04e6ef7ed6cbd50b8c904b6fb505580ae5c2a5c3fcf9e7bcb50f0d6119d3ac05"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.576544 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" event={"ID":"902a53b3-c223-40ae-9dd9-47830295158c","Type":"ContainerStarted","Data":"fe4d5854cccf493ee1ae1ffd25f4473c94728e417f2e3c6e0cb816d3102db5b3"} Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.577935 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.578475 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.579329 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.580079 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:07 crc kubenswrapper[4790]: I0313 20:33:07.580522 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.565557 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.565883 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.585518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c"} Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.588429 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.589777 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.594415 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.595034 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.595474 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.595868 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.596084 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.596319 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.597271 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.597647 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc 
kubenswrapper[4790]: I0313 20:33:08.597890 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.598146 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.598430 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.598646 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.627872 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.628522 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.628899 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.629097 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.629277 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:08 crc kubenswrapper[4790]: I0313 20:33:08.629469 4790 
status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.047089 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.048124 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.048745 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.049000 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.049280 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.049629 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.156685 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") pod \"1c05d613-28a6-4eb7-b289-e7d1cad59990\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.156797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") pod \"1c05d613-28a6-4eb7-b289-e7d1cad59990\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.156955 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") pod 
\"1c05d613-28a6-4eb7-b289-e7d1cad59990\" (UID: \"1c05d613-28a6-4eb7-b289-e7d1cad59990\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.157280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1c05d613-28a6-4eb7-b289-e7d1cad59990" (UID: "1c05d613-28a6-4eb7-b289-e7d1cad59990"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.157351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock" (OuterVolumeSpecName: "var-lock") pod "1c05d613-28a6-4eb7-b289-e7d1cad59990" (UID: "1c05d613-28a6-4eb7-b289-e7d1cad59990"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.164300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1c05d613-28a6-4eb7-b289-e7d1cad59990" (UID: "1c05d613-28a6-4eb7-b289-e7d1cad59990"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.258604 4790 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.258643 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1c05d613-28a6-4eb7-b289-e7d1cad59990-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.258655 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c05d613-28a6-4eb7-b289-e7d1cad59990-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.429233 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.430390 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.431104 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.432145 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.432645 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.433111 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.433642 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.434006 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561739 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561765 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 13 
20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561813 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561838 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.561902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.562083 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.562099 4790 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.562111 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.590101 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.590170 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.596890 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.597094 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"1c05d613-28a6-4eb7-b289-e7d1cad59990","Type":"ContainerDied","Data":"efd1d06a6ce25e4e3fca34226ca853275cb494e1f7d417b592640cdbae34182e"} Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.597150 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efd1d06a6ce25e4e3fca34226ca853275cb494e1f7d417b592640cdbae34182e" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.600037 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.601008 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" exitCode=0 Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.601083 4790 scope.go:117] "RemoveContainer" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.601166 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.602474 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.603091 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.603612 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.604147 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.604475 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 
13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.604810 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.610457 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.610788 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.616531 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.616805 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.617057 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.617214 4790 scope.go:117] "RemoveContainer" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.617458 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.622554 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.623084 4790 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.623588 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.623862 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.624130 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.624489 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.629353 4790 scope.go:117] "RemoveContainer" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.641637 4790 scope.go:117] "RemoveContainer" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.654513 4790 scope.go:117] "RemoveContainer" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.663561 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.663842 4790 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.664119 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.664618 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.664987 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.665280 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.665818 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.671584 4790 scope.go:117] "RemoveContainer" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.691278 4790 scope.go:117] "RemoveContainer" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.692208 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\": container with ID starting with 5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20 not found: ID does not exist" containerID="5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692242 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20"} err="failed to get container status \"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\": rpc error: code = NotFound desc = could not find container \"5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20\": container with ID starting with 5e4bc0da59edee0e0615a6298c36c5cf753bae0f9c8c053d8afb49bc4fd46a20 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692265 4790 scope.go:117] "RemoveContainer" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.692632 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\": container with ID starting with 
d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8 not found: ID does not exist" containerID="d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692675 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8"} err="failed to get container status \"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\": rpc error: code = NotFound desc = could not find container \"d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8\": container with ID starting with d020422a44e5b09c6ec7b68c36a9a32527c8adf61f2377424758ef2edf5870b8 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.692705 4790 scope.go:117] "RemoveContainer" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.693077 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\": container with ID starting with c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5 not found: ID does not exist" containerID="c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5"} err="failed to get container status \"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\": rpc error: code = NotFound desc = could not find container \"c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5\": container with ID starting with c4c7ff665deedd90f04e5c64da6c52cc97a2acb6746901960f2ffbf82f80c7d5 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693202 4790 scope.go:117] "RemoveContainer" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.693616 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\": container with ID starting with 70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765 not found: ID does not exist" containerID="70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693653 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765"} err="failed to get container status \"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\": rpc error: code = NotFound desc = could not find container \"70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765\": container with ID starting with 70843c72bdc3b4b2a10c911d25978a68cf1c57ff7919c4658cd3bd146823d765 not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.693675 4790 scope.go:117] "RemoveContainer" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.694158 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\": container with ID starting with 0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b not found: ID does not exist" containerID="0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.694224 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b"} err="failed to get container status \"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\": rpc error: code = NotFound desc = could not find container \"0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b\": container with ID starting with 0f5fa319b292be9f3be7d1b2f5b8cb85268c07d88ee839aa955f112b81535a2b not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.694254 4790 scope.go:117] "RemoveContainer" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.694591 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\": container with ID starting with d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b not found: ID does not exist" containerID="d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b" Mar 13 20:33:09 crc kubenswrapper[4790]: I0313 20:33:09.694621 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b"} err="failed to get container status \"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\": rpc error: code = NotFound desc = could not find container \"d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b\": container with ID starting with d370f80001c62c2af3c8f66c2ad8535cb2665609ee0f656b6a4c5c3200efa75b not found: ID does not exist" Mar 13 20:33:09 crc kubenswrapper[4790]: E0313 20:33:09.764357 4790 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" volumeName="registry-storage" Mar 13 20:33:13 crc kubenswrapper[4790]: E0313 20:33:13.189274 4790 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/events\": dial tcp 38.102.83.143:6443: connect: connection refused" event="&Event{ObjectMeta:{controller-manager-6cdc6994c6-85s67.189c80d0a9eab648 openshift-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-controller-manager,Name:controller-manager-6cdc6994c6-85s67,UID:902a53b3-c223-40ae-9dd9-47830295158c,APIVersion:v1,ResourceVersion:29927,FieldPath:spec.containers{controller-manager},},Reason:Created,Message:Created container controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-13 20:33:06.969417288 
+0000 UTC m=+317.990533189,LastTimestamp:2026-03-13 20:33:06.969417288 +0000 UTC m=+317.990533189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.248484 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.249037 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.249495 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.249809 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.250078 4790 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:14 crc kubenswrapper[4790]: I0313 20:33:14.250115 4790 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.250336 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="200ms" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.450857 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="400ms" Mar 13 20:33:14 crc kubenswrapper[4790]: E0313 20:33:14.852297 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="800ms" Mar 13 20:33:15 crc kubenswrapper[4790]: E0313 20:33:15.653856 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="1.6s" Mar 13 20:33:17 crc kubenswrapper[4790]: I0313 20:33:17.047799 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: 
request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:17 crc kubenswrapper[4790]: I0313 20:33:17.047859 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:17 crc kubenswrapper[4790]: E0313 20:33:17.255240 4790 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.143:6443: connect: connection refused" interval="3.2s" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.659795 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.660504 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.661746 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.662012 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.662262 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.662533 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.672878 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.673002 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:18 crc kubenswrapper[4790]: E0313 20:33:18.673443 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:18 crc kubenswrapper[4790]: I0313 20:33:18.673925 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.664931 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.665647 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.665900 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.666250 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.666724 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.667011 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.693893 4790 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="e5fbb63e2245590ceae285d78e589e8f5934bc1a24c72f3f77d23c9facc5745a" exitCode=0 Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.693936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"e5fbb63e2245590ceae285d78e589e8f5934bc1a24c72f3f77d23c9facc5745a"} Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.693960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a93f5d4b50c5daa9e660832fd8936842c2289e878c0cf5cafd5b1c17e110a430"} Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.694195 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.694207 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.694623 4790 status_manager.go:851] "Failed to get status for pod" podUID="902a53b3-c223-40ae-9dd9-47830295158c" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6cdc6994c6-85s67\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: E0313 20:33:19.694773 4790 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695032 4790 status_manager.go:851] "Failed to get status for pod" podUID="588cbe72-1cb6-4464-bba0-142104029595" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-d97755bf4-2ssx6\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695261 4790 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695546 4790 status_manager.go:851] "Failed to get status for pod" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" pod="openshift-marketplace/redhat-operators-fxjp7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fxjp7\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.695781 4790 status_manager.go:851] "Failed to get status for pod" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:19 crc kubenswrapper[4790]: I0313 20:33:19.696072 4790 status_manager.go:851] "Failed to get status for pod" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.143:6443: connect: connection refused" Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c58e944651629812fb1f709b068ebfe7b62d91872a1307be5be697285ef730cc"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701596 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"01475734fe300f09a2d17bf8f8df03cdd784fad61de5cf01ddb519327c89b788"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b5555b03f24880db677285e78c50a4b4bf44c68867fd113ae11a8400217fd2d7"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.701623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"6c3fced6e181d28f1686c7b499da58f2ea7f411ae0404aad88694a4e0c251831"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.703556 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.703991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.704029 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe" exitCode=1 Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.704053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe"} Mar 13 20:33:20 crc kubenswrapper[4790]: I0313 20:33:20.704564 4790 scope.go:117] "RemoveContainer" containerID="341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.711448 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.712186 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.712294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4"} Mar 13 20:33:21 
crc kubenswrapper[4790]: I0313 20:33:21.715321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"5c8b521db5ddd733e8e2fe0d342090b0d72dc8a176c4374ba2a67b1e082ba497"} Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715523 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715551 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:21 crc kubenswrapper[4790]: I0313 20:33:21.715568 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:22 crc kubenswrapper[4790]: I0313 20:33:22.699516 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:33:23 crc kubenswrapper[4790]: I0313 20:33:23.674950 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:23 crc kubenswrapper[4790]: I0313 20:33:23.675280 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:23 crc kubenswrapper[4790]: I0313 20:33:23.682861 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:26 crc kubenswrapper[4790]: I0313 20:33:26.725946 4790 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.048411 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.048486 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.751291 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.751328 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.755986 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:27 crc kubenswrapper[4790]: I0313 20:33:27.758496 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" 
oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6d72aeee-4f97-4816-89d8-511a753d2f70" Mar 13 20:33:28 crc kubenswrapper[4790]: I0313 20:33:28.757613 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:28 crc kubenswrapper[4790]: I0313 20:33:28.757646 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:29 crc kubenswrapper[4790]: I0313 20:33:29.699562 4790 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="6d72aeee-4f97-4816-89d8-511a753d2f70" Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.260860 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.261050 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.263250 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 20:33:30 crc kubenswrapper[4790]: I0313 20:33:30.850756 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" containerID="cri-o://4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383" gracePeriod=15 Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779317 4790 generic.go:334] "Generic (PLEG): container finished" podID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerID="4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383" exitCode=0 Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerDied","Data":"4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383"} Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779489 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" event={"ID":"9680aeb7-b61a-46a8-baf5-44715261e4a5","Type":"ContainerDied","Data":"7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54"} Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.779517 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e7e7141df31dfc4ded27d369062544f96ae747ef387acfa5853705562325a54" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.800136 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982510 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982575 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982618 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982648 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982682 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982757 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982786 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982829 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 
20:33:31.982873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982904 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.982966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.983000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") pod \"9680aeb7-b61a-46a8-baf5-44715261e4a5\" (UID: \"9680aeb7-b61a-46a8-baf5-44715261e4a5\") " Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.984285 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.984883 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.984936 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:33:31 crc kubenswrapper[4790]: I0313 20:33:31.998938 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.011958 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.014229 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.014343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.021310 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.022724 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8" (OuterVolumeSpecName: "kube-api-access-9fdm8") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "kube-api-access-9fdm8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.022841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.023579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.023908 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.025530 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.036893 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "9680aeb7-b61a-46a8-baf5-44715261e4a5" (UID: "9680aeb7-b61a-46a8-baf5-44715261e4a5"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084068 4790 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084100 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084111 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084120 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084129 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084140 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084181 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084192 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9fdm8\" (UniqueName: \"kubernetes.io/projected/9680aeb7-b61a-46a8-baf5-44715261e4a5-kube-api-access-9fdm8\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084203 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084212 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084221 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084229 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084237 4790 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.084246 4790 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9680aeb7-b61a-46a8-baf5-44715261e4a5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:33:32 crc kubenswrapper[4790]: I0313 20:33:32.813215 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-szftl" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.047282 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.047330 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.048988 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.049094 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.864496 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-d97755bf4-2ssx6_588cbe72-1cb6-4464-bba0-142104029595/route-controller-manager/0.log" Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.864574 4790 generic.go:334] "Generic (PLEG): container finished" podID="588cbe72-1cb6-4464-bba0-142104029595" containerID="4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30" exitCode=255 Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.864621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" 
event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerDied","Data":"4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30"} Mar 13 20:33:37 crc kubenswrapper[4790]: I0313 20:33:37.865270 4790 scope.go:117] "RemoveContainer" containerID="4c743e6d4f9c4d8ec78f2e9ce9d9828659f3f57c4e824415c3fc41b86d4afe30" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.184434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.382094 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.485023 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.707371 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.751145 4790 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.872102 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-d97755bf4-2ssx6_588cbe72-1cb6-4464-bba0-142104029595/route-controller-manager/0.log" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.872149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" event={"ID":"588cbe72-1cb6-4464-bba0-142104029595","Type":"ContainerStarted","Data":"acb4633f1c59279d63c3c311f5e9691cd648254a5622f51f14f0b0357bc20516"} Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.872680 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:38 crc kubenswrapper[4790]: I0313 20:33:38.878864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.038547 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.057220 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.107903 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.125690 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.216834 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.410694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.515929 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.661897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.733418 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.865410 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.872569 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.872646 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.879641 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.900310 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 20:33:39 crc kubenswrapper[4790]: I0313 20:33:39.928746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.009945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.189509 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.230575 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.230660 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.324489 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.411069 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.713264 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.877814 4790 patch_prober.go:28] interesting pod/route-controller-manager-d97755bf4-2ssx6 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 20:33:40 crc kubenswrapper[4790]: I0313 20:33:40.878152 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podUID="588cbe72-1cb6-4464-bba0-142104029595" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.65:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.007166 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.038408 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.045639 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.171403 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.226645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.246041 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.359432 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.375010 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.400823 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.484957 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.612720 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.621193 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.776517 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.819184 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.881652 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 13 20:33:41 crc kubenswrapper[4790]: I0313 20:33:41.903003 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.000101 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.000226 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.008259 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.058950 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.128339 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.139198 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.193273 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.241513 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.243671 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.284945 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.305584 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.359026 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.468571 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.650280 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.815650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 13 20:33:42 
crc kubenswrapper[4790]: I0313 20:33:42.849114 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.849131 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 20:33:42 crc kubenswrapper[4790]: I0313 20:33:42.871736 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.078244 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.144973 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.168767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.233046 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.304075 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.310389 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.334547 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.368559 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.419629 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.642042 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.731155 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.767054 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.880197 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.893857 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.922877 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.929142 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.936086 4790 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.937928 4790 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.969519 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.994422 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 20:33:43 crc kubenswrapper[4790]: I0313 20:33:43.996990 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.232641 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.343467 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.377434 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.474853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.635345 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.658700 4790 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.692848 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.724738 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.724977 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.809119 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.837558 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.871252 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.886079 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.937456 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 20:33:44 crc kubenswrapper[4790]: I0313 20:33:44.955179 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-apiserver"/"serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.067691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.159899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.196510 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.218813 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.248966 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.353360 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.455905 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.496363 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.506350 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.523873 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.543270 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.612094 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.672494 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.703163 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.785272 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.816502 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.831866 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 20:33:45 crc kubenswrapper[4790]: I0313 20:33:45.844950 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.051096 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.073969 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.224865 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.238010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.257511 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.315291 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.370504 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.433156 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.452902 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.468396 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.475529 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.501101 4790 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.585770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.617399 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.708119 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.727931 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.741027 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.769245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.769286 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 13 20:33:46 crc 
kubenswrapper[4790]: I0313 20:33:46.805172 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.834782 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:33:46 crc kubenswrapper[4790]: I0313 20:33:46.879145 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.000842 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.099335 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.208322 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.424736 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.621934 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.645399 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.651563 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.661352 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.697780 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.871812 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.902896 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.917133 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.934994 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 13 20:33:47 crc kubenswrapper[4790]: I0313 20:33:47.982010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.006339 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.023488 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.080131 4790 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.193322 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.312133 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.327292 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.334901 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.348150 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.451156 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.481511 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.504809 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.544304 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.587841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.596519 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.685936 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.692274 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.794903 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.795589 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.818681 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.855829 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.867832 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.892162 4790 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-stats-default" Mar 13 20:33:48 crc kubenswrapper[4790]: I0313 20:33:48.972240 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.099848 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.190206 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.274792 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.281981 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.286844 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.316402 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.326519 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.434024 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.535366 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.566003 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.586537 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.592155 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.593620 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.621137 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.621729 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.631115 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.709268 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 20:33:49 crc kubenswrapper[4790]: I0313 20:33:49.749412 4790 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.033812 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.084407 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.134450 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.141070 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.170053 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.200750 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.230473 4790 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.230530 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.230584 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.231225 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.231349 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4" gracePeriod=30 Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.523474 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.592901 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.706541 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 
20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.736928 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 20:33:50 crc kubenswrapper[4790]: I0313 20:33:50.769855 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.004253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.182899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.224689 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.285841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.340578 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.360685 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.525305 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.651617 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.659524 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.880826 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 20:33:51 crc kubenswrapper[4790]: I0313 20:33:51.989318 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.047014 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.203485 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.217715 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.288470 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.360756 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.408506 4790 reflector.go:368] Caches populated for 
*v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.420760 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.443539 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.468984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.533441 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.563002 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.685119 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.926205 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 20:33:52 crc kubenswrapper[4790]: I0313 20:33:52.969645 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.119344 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.130169 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.281912 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.281979 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.380725 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.385272 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 20:33:53 crc kubenswrapper[4790]: I0313 20:33:53.620294 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.267075 4790 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.268366 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=48.268340431 podStartE2EDuration="48.268340431s" podCreationTimestamp="2026-03-13 20:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
20:33:26.444772068 +0000 UTC m=+337.465887959" watchObservedRunningTime="2026-03-13 20:33:54.268340431 +0000 UTC m=+365.289456352" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.269915 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-d97755bf4-2ssx6" podStartSLOduration=51.269890374 podStartE2EDuration="51.269890374s" podCreationTimestamp="2026-03-13 20:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:26.435943188 +0000 UTC m=+337.457059079" watchObservedRunningTime="2026-03-13 20:33:54.269890374 +0000 UTC m=+365.291006305" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.273207 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6cdc6994c6-85s67" podStartSLOduration=51.273196173 podStartE2EDuration="51.273196173s" podCreationTimestamp="2026-03-13 20:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:26.422073972 +0000 UTC m=+337.443189873" watchObservedRunningTime="2026-03-13 20:33:54.273196173 +0000 UTC m=+365.294312104" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275116 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-szftl","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275229 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq","openshift-kube-apiserver/kube-apiserver-crc"] Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275823 4790 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.275852 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="e4da2be5-d947-41bd-b381-0b9eae10293d" Mar 13 20:33:54 crc kubenswrapper[4790]: E0313 20:33:54.276639 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerName="installer" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276670 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerName="installer" Mar 13 20:33:54 crc kubenswrapper[4790]: E0313 20:33:54.276711 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276729 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276949 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c05d613-28a6-4eb7-b289-e7d1cad59990" containerName="installer" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.276996 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" containerName="oauth-openshift" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.277888 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.281368 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.281854 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.282097 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.283238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.283552 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.283736 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285046 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285549 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.285884 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.286606 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.287124 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.287454 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.292187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.296333 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.300755 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.305629 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.345810 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=28.345791624 
podStartE2EDuration="28.345791624s" podCreationTimestamp="2026-03-13 20:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:54.312083849 +0000 UTC m=+365.333199750" watchObservedRunningTime="2026-03-13 20:33:54.345791624 +0000 UTC m=+365.366907525" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.397838 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.397880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-error\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.397967 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78308f12-fefa-41d3-845f-009863f92a51-audit-dir\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398005 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-service-ca\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398058 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " 
pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398196 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-router-certs\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9w8hp\" (UniqueName: \"kubernetes.io/projected/78308f12-fefa-41d3-845f-009863f92a51-kube-api-access-9w8hp\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-audit-policies\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-session\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398331 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-login\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398428 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.398490 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.474768 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499462 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-login\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499576 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-error\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499623 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78308f12-fefa-41d3-845f-009863f92a51-audit-dir\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-service-ca\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499684 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499703 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-router-certs\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9w8hp\" (UniqueName: \"kubernetes.io/projected/78308f12-fefa-41d3-845f-009863f92a51-kube-api-access-9w8hp\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-audit-policies\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.499812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-session\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.501188 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/78308f12-fefa-41d3-845f-009863f92a51-audit-dir\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.501783 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-audit-policies\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.502202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-service-ca\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.503183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.504349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.506332 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-router-certs\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.507016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-error\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.507220 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-login\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.509153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.511286 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.512321 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.513343 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-system-session\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.513678 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/78308f12-fefa-41d3-845f-009863f92a51-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.521649 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9w8hp\" (UniqueName: \"kubernetes.io/projected/78308f12-fefa-41d3-845f-009863f92a51-kube-api-access-9w8hp\") pod \"oauth-openshift-6467d9dbc9-4l2sq\" (UID: \"78308f12-fefa-41d3-845f-009863f92a51\") " pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.714669 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.858803 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 20:33:54 crc kubenswrapper[4790]: I0313 20:33:54.932841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.158732 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq"] Mar 13 20:33:55 crc kubenswrapper[4790]: W0313 20:33:55.163795 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78308f12_fefa_41d3_845f_009863f92a51.slice/crio-b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f WatchSource:0}: Error finding container b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f: Status 404 returned error can't find the container with id b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.251533 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.674847 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9680aeb7-b61a-46a8-baf5-44715261e4a5" path="/var/lib/kubelet/pods/9680aeb7-b61a-46a8-baf5-44715261e4a5/volumes" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.961841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" 
event={"ID":"78308f12-fefa-41d3-845f-009863f92a51","Type":"ContainerStarted","Data":"c13c4946072e428c448655c578855828cd450f099432a68247012368d8cfd9cf"} Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.961900 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" event={"ID":"78308f12-fefa-41d3-845f-009863f92a51","Type":"ContainerStarted","Data":"b846835915cf2202ae7004f2bc21c38e8be6564452e3ce976ac09b248165446f"} Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.962098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:33:55 crc kubenswrapper[4790]: I0313 20:33:55.988660 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" podStartSLOduration=50.988643961 podStartE2EDuration="50.988643961s" podCreationTimestamp="2026-03-13 20:33:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:33:55.985676141 +0000 UTC m=+367.006792032" watchObservedRunningTime="2026-03-13 20:33:55.988643961 +0000 UTC m=+367.009759852" Mar 13 20:33:56 crc kubenswrapper[4790]: I0313 20:33:56.170011 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6467d9dbc9-4l2sq" Mar 13 20:34:00 crc kubenswrapper[4790]: I0313 20:34:00.327744 4790 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:34:00 crc kubenswrapper[4790]: I0313 20:34:00.328224 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" gracePeriod=5 Mar 13 20:34:05 crc kubenswrapper[4790]: I0313 20:34:05.897230 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 20:34:05 crc kubenswrapper[4790]: I0313 20:34:05.897917 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020320 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020400 4790 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" exitCode=137 Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020446 4790 scope.go:117] "RemoveContainer" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.020485 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.045630 4790 scope.go:117] "RemoveContainer" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" Mar 13 20:34:06 crc kubenswrapper[4790]: E0313 20:34:06.046226 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c\": container with ID starting with 5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c not found: ID does not exist" containerID="5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.046274 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c"} err="failed to get container status \"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c\": rpc error: code = NotFound desc = could not find container \"5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c\": container with ID starting with 5effc06bf48d765836aa18784b4a5c05009cf94e90166ba523e8366d5ef9948c not found: ID does not exist" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.053955 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054096 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054174 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054294 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054854 4790 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054892 4790 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054920 4790 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.054944 4790 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.070755 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:34:06 crc kubenswrapper[4790]: I0313 20:34:06.156252 4790 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.667322 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.667834 4790 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.681447 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.681480 4790 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a7640fba-05c9-458e-a161-3dafdd60af62" Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.685710 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 13 20:34:07 crc kubenswrapper[4790]: I0313 20:34:07.685749 4790 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="a7640fba-05c9-458e-a161-3dafdd60af62" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.399508 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:34:15 crc kubenswrapper[4790]: E0313 20:34:15.401279 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.401360 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.401595 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.402132 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.403839 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.405647 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.405914 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.415206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.479180 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"auto-csr-approver-29557234-6g6zh\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.580795 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"auto-csr-approver-29557234-6g6zh\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.612456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"auto-csr-approver-29557234-6g6zh\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:15 crc kubenswrapper[4790]: I0313 20:34:15.729872 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:16 crc kubenswrapper[4790]: I0313 20:34:16.130131 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:34:17 crc kubenswrapper[4790]: I0313 20:34:17.095640 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" event={"ID":"6b8e0ffa-a21f-4726-8185-2cff61c94b91","Type":"ContainerStarted","Data":"84311e084a32f72f4c06887a54648dd1d34db74a2eab079a4647daa3afee4d12"} Mar 13 20:34:18 crc kubenswrapper[4790]: I0313 20:34:18.103369 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerID="4a133641d0a543ddd92802af2ba335acfaf29e7ed5636f43383cb7790a817cba" exitCode=0 Mar 13 20:34:18 crc kubenswrapper[4790]: I0313 20:34:18.103499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" event={"ID":"6b8e0ffa-a21f-4726-8185-2cff61c94b91","Type":"ContainerDied","Data":"4a133641d0a543ddd92802af2ba335acfaf29e7ed5636f43383cb7790a817cba"} Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.447468 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.525722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") pod \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\" (UID: \"6b8e0ffa-a21f-4726-8185-2cff61c94b91\") " Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.532593 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc" (OuterVolumeSpecName: "kube-api-access-4pvcc") pod "6b8e0ffa-a21f-4726-8185-2cff61c94b91" (UID: "6b8e0ffa-a21f-4726-8185-2cff61c94b91"). InnerVolumeSpecName "kube-api-access-4pvcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:19 crc kubenswrapper[4790]: I0313 20:34:19.627557 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pvcc\" (UniqueName: \"kubernetes.io/projected/6b8e0ffa-a21f-4726-8185-2cff61c94b91-kube-api-access-4pvcc\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:20 crc kubenswrapper[4790]: I0313 20:34:20.123889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" event={"ID":"6b8e0ffa-a21f-4726-8185-2cff61c94b91","Type":"ContainerDied","Data":"84311e084a32f72f4c06887a54648dd1d34db74a2eab079a4647daa3afee4d12"} Mar 13 20:34:20 crc kubenswrapper[4790]: I0313 20:34:20.124207 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84311e084a32f72f4c06887a54648dd1d34db74a2eab079a4647daa3afee4d12" Mar 13 20:34:20 crc kubenswrapper[4790]: I0313 20:34:20.123978 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557234-6g6zh" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.133056 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135022 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135551 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135591 4790 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4" exitCode=137 Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135621 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c72290a070b857bc3ddf32c051800ee3fe9e55397ddcdfd5d29c98edd59be0a4"} Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"1e56deb82f678c244918661e5eb7ebb160fe4919974a43590f2a4219eb47bf01"} Mar 13 20:34:21 crc kubenswrapper[4790]: I0313 20:34:21.135668 4790 scope.go:117] "RemoveContainer" containerID="341e941bef336568ae81aba85e7bbeb4a08c7e3fee6201bf7a2adac679b908fe" Mar 13 20:34:22 crc kubenswrapper[4790]: I0313 20:34:22.142711 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 13 20:34:22 crc kubenswrapper[4790]: I0313 20:34:22.144050 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 13 20:34:22 crc kubenswrapper[4790]: I0313 20:34:22.699771 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:30 crc kubenswrapper[4790]: I0313 20:34:30.229942 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:30 crc kubenswrapper[4790]: I0313 20:34:30.237565 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:31 crc kubenswrapper[4790]: I0313 20:34:31.206644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 13 20:34:31 crc kubenswrapper[4790]: I0313 20:34:31.846738 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:34:31 crc kubenswrapper[4790]: I0313 20:34:31.846990 4790 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/certified-operators-df8gv" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" containerID="cri-o://934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a" gracePeriod=2 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.050327 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.050974 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5tr4n" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" containerID="cri-o://283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38" gracePeriod=2 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.211582 4790 generic.go:334] "Generic (PLEG): container finished" podID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerID="934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a" exitCode=0 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.211657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a"} Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.213804 4790 generic.go:334] "Generic (PLEG): container finished" podID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerID="283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38" exitCode=0 Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.214469 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38"} Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.270013 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.388173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") pod \"da03af74-8c59-4ccf-aff8-03dc6303e322\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.388233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") pod \"da03af74-8c59-4ccf-aff8-03dc6303e322\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.388347 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") pod \"da03af74-8c59-4ccf-aff8-03dc6303e322\" (UID: \"da03af74-8c59-4ccf-aff8-03dc6303e322\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.389034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities" (OuterVolumeSpecName: "utilities") pod "da03af74-8c59-4ccf-aff8-03dc6303e322" (UID: "da03af74-8c59-4ccf-aff8-03dc6303e322"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.389583 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.393891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57" (OuterVolumeSpecName: "kube-api-access-fwk57") pod "da03af74-8c59-4ccf-aff8-03dc6303e322" (UID: "da03af74-8c59-4ccf-aff8-03dc6303e322"). InnerVolumeSpecName "kube-api-access-fwk57". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.394931 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.439041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "da03af74-8c59-4ccf-aff8-03dc6303e322" (UID: "da03af74-8c59-4ccf-aff8-03dc6303e322"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490278 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") pod \"446f0f4c-a97c-47d0-929d-0b99e07c8186\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490351 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") pod \"446f0f4c-a97c-47d0-929d-0b99e07c8186\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490514 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") pod \"446f0f4c-a97c-47d0-929d-0b99e07c8186\" (UID: \"446f0f4c-a97c-47d0-929d-0b99e07c8186\") " Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490756 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fwk57\" (UniqueName: \"kubernetes.io/projected/da03af74-8c59-4ccf-aff8-03dc6303e322-kube-api-access-fwk57\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.490798 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/da03af74-8c59-4ccf-aff8-03dc6303e322-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.491478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities" (OuterVolumeSpecName: "utilities") pod "446f0f4c-a97c-47d0-929d-0b99e07c8186" (UID: "446f0f4c-a97c-47d0-929d-0b99e07c8186"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.494095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw" (OuterVolumeSpecName: "kube-api-access-4dmtw") pod "446f0f4c-a97c-47d0-929d-0b99e07c8186" (UID: "446f0f4c-a97c-47d0-929d-0b99e07c8186"). InnerVolumeSpecName "kube-api-access-4dmtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.539472 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "446f0f4c-a97c-47d0-929d-0b99e07c8186" (UID: "446f0f4c-a97c-47d0-929d-0b99e07c8186"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.592067 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.592097 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dmtw\" (UniqueName: \"kubernetes.io/projected/446f0f4c-a97c-47d0-929d-0b99e07c8186-kube-api-access-4dmtw\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:32 crc kubenswrapper[4790]: I0313 20:34:32.592109 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/446f0f4c-a97c-47d0-929d-0b99e07c8186-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.220928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-df8gv" event={"ID":"da03af74-8c59-4ccf-aff8-03dc6303e322","Type":"ContainerDied","Data":"9761002ea58d403e6092f58c313ddf3e3892646900d306f6d06f23ff553f5760"} Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.221031 4790 scope.go:117] "RemoveContainer" containerID="934478e1636def539b4b75131eeeef3a5a527bcd02efeeb3dc4dc663186f9f4a" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.220959 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-df8gv" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.225094 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5tr4n" event={"ID":"446f0f4c-a97c-47d0-929d-0b99e07c8186","Type":"ContainerDied","Data":"1f3bbc4d7d37e2d400e1366f116e79095d38ddf23a471dd30cc3d7e41c04740d"} Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.225216 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5tr4n" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.240678 4790 scope.go:117] "RemoveContainer" containerID="69f53c59d1e74a1fc57678e4a1a5f136fbff7feef571b3a55782dea49bf4ca77" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.251787 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.255920 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-df8gv"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.265851 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.282926 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5tr4n"] Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.289360 4790 scope.go:117] "RemoveContainer" containerID="18d45729b57b0625b6ac059bc91aedd72d39045472cf08d5152f47c470f71f43" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.313646 4790 scope.go:117] "RemoveContainer" containerID="283af7b78e5df22c61725b66908c69af3f6b7ed01b3dc5cf3a313cb16df58c38" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.332554 4790 scope.go:117] "RemoveContainer" containerID="58a3c18d60db23fb517df83cf8f798fb4a929be2cac998373fad7a7e27e0143b" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.364446 4790 scope.go:117] "RemoveContainer" containerID="33326be198fd78688d8c0e82df3982727cfbc7e94ef4969d1503af495b1859ed" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.665721 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" path="/var/lib/kubelet/pods/446f0f4c-a97c-47d0-929d-0b99e07c8186/volumes" Mar 13 20:34:33 crc kubenswrapper[4790]: I0313 20:34:33.666742 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" path="/var/lib/kubelet/pods/da03af74-8c59-4ccf-aff8-03dc6303e322/volumes" Mar 13 20:34:37 crc kubenswrapper[4790]: I0313 20:34:37.847197 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:34:37 crc kubenswrapper[4790]: I0313 20:34:37.847930 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fxjp7" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" containerID="cri-o://2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce" gracePeriod=2 Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.252567 4790 generic.go:334] "Generic (PLEG): container finished" podID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerID="2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce" exitCode=0 Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.252624 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce"} Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.313656 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.461048 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") pod \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.461132 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") pod \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.461204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") pod \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\" (UID: \"4aa0c26b-aef8-49e9-9904-da9e8d029c9d\") " Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.462202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities" (OuterVolumeSpecName: "utilities") pod "4aa0c26b-aef8-49e9-9904-da9e8d029c9d" (UID: "4aa0c26b-aef8-49e9-9904-da9e8d029c9d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.465804 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f" (OuterVolumeSpecName: "kube-api-access-vhj8f") pod "4aa0c26b-aef8-49e9-9904-da9e8d029c9d" (UID: "4aa0c26b-aef8-49e9-9904-da9e8d029c9d"). InnerVolumeSpecName "kube-api-access-vhj8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.562993 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.563107 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhj8f\" (UniqueName: \"kubernetes.io/projected/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-kube-api-access-vhj8f\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.591544 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4aa0c26b-aef8-49e9-9904-da9e8d029c9d" (UID: "4aa0c26b-aef8-49e9-9904-da9e8d029c9d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:34:38 crc kubenswrapper[4790]: I0313 20:34:38.668125 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4aa0c26b-aef8-49e9-9904-da9e8d029c9d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.260363 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fxjp7" event={"ID":"4aa0c26b-aef8-49e9-9904-da9e8d029c9d","Type":"ContainerDied","Data":"050e353cf4b2b386c77190a755e24b1d103134a927f84598ad4dbf53d6d3a4fa"} Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.260466 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fxjp7" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.260723 4790 scope.go:117] "RemoveContainer" containerID="2991bdf1214b34771f3920c4e5c74e4a6f7ce03bf40eb290c472871cdaa464ce" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.274881 4790 scope.go:117] "RemoveContainer" containerID="324ef417e590b70303b2a28886536562959e53b4d52847bd1309db91eab7a573" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.290771 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.295041 4790 scope.go:117] "RemoveContainer" containerID="f276b163ccc0d21403b49d02b3c506a94213d0bcc943d5fcede3603bc020ebfc" Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.296233 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fxjp7"] Mar 13 20:34:39 crc kubenswrapper[4790]: I0313 20:34:39.665946 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" path="/var/lib/kubelet/pods/4aa0c26b-aef8-49e9-9904-da9e8d029c9d/volumes" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765041 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xf7s4"] Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765308 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765327 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765345 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765353 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765366 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765391 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765406 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" Mar 13 
20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765413 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765427 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerName="oc" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765435 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerName="oc" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765444 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765452 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765459 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765466 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765477 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765484 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-utilities" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765494 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765502 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: E0313 20:34:40.765512 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765519 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="extract-content" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765648 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aa0c26b-aef8-49e9-9904-da9e8d029c9d" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765664 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" containerName="oc" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765682 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="da03af74-8c59-4ccf-aff8-03dc6303e322" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.765697 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="446f0f4c-a97c-47d0-929d-0b99e07c8186" containerName="registry-server" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.766128 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.791648 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xf7s4"] Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.893632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-certificates\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.893898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-trusted-ca\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894022 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894151 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-bound-sa-token\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894353 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsl42\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-kube-api-access-tsl42\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894461 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.894543 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-tls\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.911793 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-bound-sa-token\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsl42\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-kube-api-access-tsl42\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995579 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-tls\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-certificates\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.995627 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-trusted-ca\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.996490 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.997126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-certificates\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:40 crc kubenswrapper[4790]: I0313 20:34:40.997140 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-trusted-ca\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.000438 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-registry-tls\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.010326 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-bound-sa-token\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.016179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.019180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsl42\" (UniqueName: \"kubernetes.io/projected/180ef86a-6ccb-4c72-9722-be08fb3c8bc7-kube-api-access-tsl42\") pod \"image-registry-66df7c8f76-xf7s4\" (UID: \"180ef86a-6ccb-4c72-9722-be08fb3c8bc7\") " pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.081304 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:41 crc kubenswrapper[4790]: I0313 20:34:41.471578 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-xf7s4"] Mar 13 20:34:42 crc kubenswrapper[4790]: I0313 20:34:42.279885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" event={"ID":"180ef86a-6ccb-4c72-9722-be08fb3c8bc7","Type":"ContainerStarted","Data":"54695274d9f3009dd8466f0dee6264bd5c891fc94c0257d695ff542d0cf8fe96"} Mar 13 20:34:42 crc kubenswrapper[4790]: I0313 20:34:42.280219 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:34:42 crc kubenswrapper[4790]: I0313 20:34:42.280231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" event={"ID":"180ef86a-6ccb-4c72-9722-be08fb3c8bc7","Type":"ContainerStarted","Data":"13eae60b8677d2ee63a2140ef49608d4470d49da2be0ca0f3a2066fd422617ec"} Mar 13 20:34:44 crc kubenswrapper[4790]: I0313 20:34:44.015517 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:34:44 crc kubenswrapper[4790]: I0313 20:34:44.015573 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:35:01 crc kubenswrapper[4790]: I0313 20:35:01.086675 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" Mar 13 20:35:01 crc kubenswrapper[4790]: I0313 20:35:01.127007 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-xf7s4" podStartSLOduration=21.12697509 podStartE2EDuration="21.12697509s" podCreationTimestamp="2026-03-13 20:34:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:34:42.302464489 +0000 UTC m=+413.323580380" watchObservedRunningTime="2026-03-13 20:35:01.12697509 +0000 UTC m=+432.148091021" Mar 13 20:35:01 crc kubenswrapper[4790]: I0313 20:35:01.152358 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:35:14 crc kubenswrapper[4790]: I0313 20:35:14.015717 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:35:14 crc kubenswrapper[4790]: I0313 20:35:14.016544 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.043241 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.044081 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-txx64" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" containerID="cri-o://e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.081870 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.082148 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-672cv" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" containerID="cri-o://cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.088293 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.089439 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" containerID="cri-o://ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.090997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n548b"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.091926 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.095168 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.095547 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bq4pj" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" containerID="cri-o://674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.099222 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.099493 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-hnd2l" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" containerID="cri-o://85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af" gracePeriod=30 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.107086 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n548b"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.238592 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6jzp\" (UniqueName: \"kubernetes.io/projected/97fe66e8-7366-4c61-b1db-4d98459834da-kube-api-access-z6jzp\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.238951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.239035 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.340102 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.340215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6jzp\" (UniqueName: \"kubernetes.io/projected/97fe66e8-7366-4c61-b1db-4d98459834da-kube-api-access-z6jzp\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.340257 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.346201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.354827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97fe66e8-7366-4c61-b1db-4d98459834da-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.375113 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6jzp\" (UniqueName: \"kubernetes.io/projected/97fe66e8-7366-4c61-b1db-4d98459834da-kube-api-access-z6jzp\") pod \"marketplace-operator-79b997595-n548b\" (UID: \"97fe66e8-7366-4c61-b1db-4d98459834da\") " pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.414818 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.530160 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.532087 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.544052 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557453 4790 generic.go:334] "Generic (PLEG): container finished" podID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerDied","Data":"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" event={"ID":"53c38463-b7c5-42c8-a447-7d0e7f190aa9","Type":"ContainerDied","Data":"bad985ac5d6a6fd6a14b185a97704f5e25df7aba222388f921733e6977b5b5eb"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557590 4790 scope.go:117] "RemoveContainer" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.557623 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-jnbzb" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.570989 4790 generic.go:334] "Generic (PLEG): container finished" podID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerID="674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.571075 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.573962 4790 generic.go:334] "Generic (PLEG): container finished" podID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.574023 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.574057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-txx64" event={"ID":"7080e6b3-5934-4c2c-9361-23d20b5a495e","Type":"ContainerDied","Data":"5bff08277bee799461658bd86530c13fa744a49d2daab25cbda9f9c23ac16aa2"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.574120 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-txx64" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.578548 4790 generic.go:334] "Generic (PLEG): container finished" podID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerID="85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.578633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580625 4790 generic.go:334] "Generic (PLEG): container finished" podID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" exitCode=0 Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580656 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580676 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-672cv" event={"ID":"dbee8a79-e625-49ef-8fcb-944341ae6e37","Type":"ContainerDied","Data":"073e407a9eaa46913e8a833719c1712b0b191e45db2255328c3b799329f32f02"} Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.580740 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-672cv" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.586601 4790 scope.go:117] "RemoveContainer" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.587333 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010\": container with ID starting with ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010 not found: ID does not exist" containerID="ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.587387 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010"} err="failed to get container status \"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010\": rpc error: code = NotFound desc = could not find container \"ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010\": container with ID starting with ca6171503f40ceed13aaa534a35717adcebed3a5144de85cb7676739a6296010 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.587415 4790 scope.go:117] "RemoveContainer" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.613316 4790 scope.go:117] "RemoveContainer" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.615021 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.619345 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.642359 4790 scope.go:117] "RemoveContainer" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.642812 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") pod \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.642888 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") pod \"7080e6b3-5934-4c2c-9361-23d20b5a495e\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.643799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "53c38463-b7c5-42c8-a447-7d0e7f190aa9" (UID: "53c38463-b7c5-42c8-a447-7d0e7f190aa9"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644101 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") pod \"7080e6b3-5934-4c2c-9361-23d20b5a495e\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644164 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") pod \"dbee8a79-e625-49ef-8fcb-944341ae6e37\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") pod \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") pod \"dbee8a79-e625-49ef-8fcb-944341ae6e37\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644267 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") pod \"dbee8a79-e625-49ef-8fcb-944341ae6e37\" (UID: \"dbee8a79-e625-49ef-8fcb-944341ae6e37\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644303 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") pod \"7080e6b3-5934-4c2c-9361-23d20b5a495e\" (UID: \"7080e6b3-5934-4c2c-9361-23d20b5a495e\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644329 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") pod \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\" (UID: \"53c38463-b7c5-42c8-a447-7d0e7f190aa9\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644701 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.644983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities" (OuterVolumeSpecName: "utilities") pod "dbee8a79-e625-49ef-8fcb-944341ae6e37" (UID: "dbee8a79-e625-49ef-8fcb-944341ae6e37"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.648573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp" (OuterVolumeSpecName: "kube-api-access-zhkbp") pod "dbee8a79-e625-49ef-8fcb-944341ae6e37" (UID: "dbee8a79-e625-49ef-8fcb-944341ae6e37"). InnerVolumeSpecName "kube-api-access-zhkbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.648927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities" (OuterVolumeSpecName: "utilities") pod "7080e6b3-5934-4c2c-9361-23d20b5a495e" (UID: "7080e6b3-5934-4c2c-9361-23d20b5a495e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.655141 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "53c38463-b7c5-42c8-a447-7d0e7f190aa9" (UID: "53c38463-b7c5-42c8-a447-7d0e7f190aa9"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.655525 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct" (OuterVolumeSpecName: "kube-api-access-hskct") pod "7080e6b3-5934-4c2c-9361-23d20b5a495e" (UID: "7080e6b3-5934-4c2c-9361-23d20b5a495e"). InnerVolumeSpecName "kube-api-access-hskct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.655614 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl" (OuterVolumeSpecName: "kube-api-access-zfldl") pod "53c38463-b7c5-42c8-a447-7d0e7f190aa9" (UID: "53c38463-b7c5-42c8-a447-7d0e7f190aa9"). 
InnerVolumeSpecName "kube-api-access-zfldl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.675888 4790 scope.go:117] "RemoveContainer" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.676390 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723\": container with ID starting with e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723 not found: ID does not exist" containerID="e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676429 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723"} err="failed to get container status \"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723\": rpc error: code = NotFound desc = could not find container \"e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723\": container with ID starting with e10a350599a337024da4df4d724f8dae1ffc815e17e088500d51b914e7fbb723 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676457 4790 scope.go:117] "RemoveContainer" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.676770 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73\": container with ID starting with 4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73 not found: ID does not exist" containerID="4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676801 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73"} err="failed to get container status \"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73\": rpc error: code = NotFound desc = could not find container \"4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73\": container with ID starting with 4370e86c98a9de03c1ac0f85379114290ac05dc88ded272c2ca9ced9f165ce73 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.676819 4790 scope.go:117] "RemoveContainer" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.677098 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566\": container with ID starting with 37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566 not found: ID does not exist" containerID="37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.677128 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566"} err="failed to get container status 
\"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566\": rpc error: code = NotFound desc = could not find container \"37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566\": container with ID starting with 37f1fa4e4095d22491db4f81d70f3406da6fafc539527a21b3dba5846164e566 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.677149 4790 scope.go:117] "RemoveContainer" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.698182 4790 scope.go:117] "RemoveContainer" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.710718 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dbee8a79-e625-49ef-8fcb-944341ae6e37" (UID: "dbee8a79-e625-49ef-8fcb-944341ae6e37"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.714341 4790 scope.go:117] "RemoveContainer" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.714926 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7080e6b3-5934-4c2c-9361-23d20b5a495e" (UID: "7080e6b3-5934-4c2c-9361-23d20b5a495e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.728894 4790 scope.go:117] "RemoveContainer" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.729276 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88\": container with ID starting with cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88 not found: ID does not exist" containerID="cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729308 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88"} err="failed to get container status \"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88\": rpc error: code = NotFound desc = could not find container \"cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88\": container with ID starting with cd207ce82ced87fdafc05394ca2a86e862e8d9217c17b8cddd7abb0bca23bd88 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729327 4790 scope.go:117] "RemoveContainer" containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.729666 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91\": container with ID starting with 3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91 not found: ID does not exist" 
containerID="3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729687 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91"} err="failed to get container status \"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91\": rpc error: code = NotFound desc = could not find container \"3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91\": container with ID starting with 3a3703bb9c49d2204814c4b8d5e3414b03bd6a68f2376f589235d94599b77a91 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.729700 4790 scope.go:117] "RemoveContainer" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" Mar 13 20:35:24 crc kubenswrapper[4790]: E0313 20:35:24.730056 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975\": container with ID starting with 721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975 not found: ID does not exist" containerID="721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.730117 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975"} err="failed to get container status \"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975\": rpc error: code = NotFound desc = could not find container \"721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975\": container with ID starting with 721e8d71cffd6022d21d74d5c95c4b0f3755ad66a2257e8a4590088e187a7975 not found: ID does not exist" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") pod \"36d32cb2-55c9-48cc-9376-66231ae66f8a\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745703 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") pod \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745746 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") pod \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745805 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") pod \"36d32cb2-55c9-48cc-9376-66231ae66f8a\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745859 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") pod \"36d32cb2-55c9-48cc-9376-66231ae66f8a\" (UID: \"36d32cb2-55c9-48cc-9376-66231ae66f8a\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.745885 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") pod \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\" (UID: \"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c\") " Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746328 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hskct\" (UniqueName: \"kubernetes.io/projected/7080e6b3-5934-4c2c-9361-23d20b5a495e-kube-api-access-hskct\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746349 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746360 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746389 4790 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/53c38463-b7c5-42c8-a447-7d0e7f190aa9-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746401 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbee8a79-e625-49ef-8fcb-944341ae6e37-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746412 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhkbp\" (UniqueName: \"kubernetes.io/projected/dbee8a79-e625-49ef-8fcb-944341ae6e37-kube-api-access-zhkbp\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746422 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7080e6b3-5934-4c2c-9361-23d20b5a495e-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.746431 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zfldl\" (UniqueName: \"kubernetes.io/projected/53c38463-b7c5-42c8-a447-7d0e7f190aa9-kube-api-access-zfldl\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.748438 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities" (OuterVolumeSpecName: "utilities") pod "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" (UID: "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.748860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities" (OuterVolumeSpecName: "utilities") pod "36d32cb2-55c9-48cc-9376-66231ae66f8a" (UID: "36d32cb2-55c9-48cc-9376-66231ae66f8a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.749781 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw" (OuterVolumeSpecName: "kube-api-access-7vdxw") pod "36d32cb2-55c9-48cc-9376-66231ae66f8a" (UID: "36d32cb2-55c9-48cc-9376-66231ae66f8a"). InnerVolumeSpecName "kube-api-access-7vdxw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.751090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8" (OuterVolumeSpecName: "kube-api-access-xbqp8") pod "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" (UID: "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c"). InnerVolumeSpecName "kube-api-access-xbqp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.772642 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" (UID: "e17d5bd1-f368-47a4-80cb-3bd3eb4b822c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847269 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbqp8\" (UniqueName: \"kubernetes.io/projected/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-kube-api-access-xbqp8\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847297 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vdxw\" (UniqueName: \"kubernetes.io/projected/36d32cb2-55c9-48cc-9376-66231ae66f8a-kube-api-access-7vdxw\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847307 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847317 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.847327 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.882935 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-n548b"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.887733 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36d32cb2-55c9-48cc-9376-66231ae66f8a" (UID: "36d32cb2-55c9-48cc-9376-66231ae66f8a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.897473 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.897533 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-jnbzb"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.915604 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.919281 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-672cv"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.933586 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.937739 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-txx64"] Mar 13 20:35:24 crc kubenswrapper[4790]: I0313 20:35:24.948642 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36d32cb2-55c9-48cc-9376-66231ae66f8a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.592349 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bq4pj" event={"ID":"e17d5bd1-f368-47a4-80cb-3bd3eb4b822c","Type":"ContainerDied","Data":"48ce1cd0515d2f72905d7c3b45c89c2baec4ecf2f36741a13ea570b7bf830ee2"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.592730 4790 scope.go:117] "RemoveContainer" containerID="674b4b30c55e5b326d6218ed4dd61e880c35ab5aace228b74177c0e6379905ee" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.592437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bq4pj" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.596524 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" event={"ID":"97fe66e8-7366-4c61-b1db-4d98459834da","Type":"ContainerStarted","Data":"4411b6997bd2d48e537ea3015a3a92b116e5bb55ef646ff60a8f03599e1dd656"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.596644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.596718 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" event={"ID":"97fe66e8-7366-4c61-b1db-4d98459834da","Type":"ContainerStarted","Data":"241fd60f722d811433f2a0a1db304b3b11e77ad867ee343ee1aec7da09239dcf"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.601126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-hnd2l" event={"ID":"36d32cb2-55c9-48cc-9376-66231ae66f8a","Type":"ContainerDied","Data":"5433561752fd3b8f83751ddd33926ccfe479acc64fdf830adcad528290d813de"} Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.601452 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-hnd2l" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.602208 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.615425 4790 scope.go:117] "RemoveContainer" containerID="fb06926f483f81716d03c8b9371fdea2581fe7126069171b7e5648810c33b206" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.624226 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-n548b" podStartSLOduration=1.624179238 podStartE2EDuration="1.624179238s" podCreationTimestamp="2026-03-13 20:35:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:35:25.619619161 +0000 UTC m=+456.640735052" watchObservedRunningTime="2026-03-13 20:35:25.624179238 +0000 UTC m=+456.645295129" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.645678 4790 scope.go:117] "RemoveContainer" containerID="81d571ea6f444235cc217ca2f76bd3ade803e952dcea7fa197b363c62b207fc9" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.685284 4790 scope.go:117] "RemoveContainer" containerID="85bd59b87e1f4b58047275cf65a277b9c79fac88d40c0b516ac9852cc7b0c0af" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.686231 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" path="/var/lib/kubelet/pods/53c38463-b7c5-42c8-a447-7d0e7f190aa9/volumes" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.687731 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" path="/var/lib/kubelet/pods/7080e6b3-5934-4c2c-9361-23d20b5a495e/volumes" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.698909 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" path="/var/lib/kubelet/pods/dbee8a79-e625-49ef-8fcb-944341ae6e37/volumes" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699780 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699831 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bq4pj"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699854 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.699871 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-hnd2l"] Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.705428 4790 scope.go:117] "RemoveContainer" containerID="73d3471f670ba4404f090445863d367e893e2298e86dde9160ee12a7e04a36a6" Mar 13 20:35:25 crc kubenswrapper[4790]: I0313 20:35:25.721800 4790 scope.go:117] "RemoveContainer" containerID="4ccfbd25425ce912c32c0f73aa49b376929e5a036b5718d87d565520eab1f4ab" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.044799 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q5brt"] Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045754 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" 
containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045790 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045804 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045813 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045822 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045832 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045844 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045852 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045864 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045872 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045883 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045892 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045906 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045916 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045928 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="extract-utilities" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045946 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045955 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045968 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045976 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.045990 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.045998 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.046013 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046021 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="extract-content" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.046033 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046041 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046151 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbee8a79-e625-49ef-8fcb-944341ae6e37" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046166 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046175 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7080e6b3-5934-4c2c-9361-23d20b5a495e" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046191 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" containerName="registry-server" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.046206 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="53c38463-b7c5-42c8-a447-7d0e7f190aa9" containerName="marketplace-operator" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.047114 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.049626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.052086 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5brt"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.169976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-utilities\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.170026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-catalog-content\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.170064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdh84\" (UniqueName: \"kubernetes.io/projected/9e374399-85bd-4121-9352-23a37bdf41f3-kube-api-access-cdh84\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.194119 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" containerID="cri-o://1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" gracePeriod=30 Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.271398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-utilities\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.271446 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-catalog-content\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.271485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdh84\" (UniqueName: \"kubernetes.io/projected/9e374399-85bd-4121-9352-23a37bdf41f3-kube-api-access-cdh84\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.272152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-catalog-content\") pod \"certified-operators-q5brt\" (UID: 
\"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.272156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9e374399-85bd-4121-9352-23a37bdf41f3-utilities\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.288814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdh84\" (UniqueName: \"kubernetes.io/projected/9e374399-85bd-4121-9352-23a37bdf41f3-kube-api-access-cdh84\") pod \"certified-operators-q5brt\" (UID: \"9e374399-85bd-4121-9352-23a37bdf41f3\") " pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.365772 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.571395 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.572253 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q5brt"] Mar 13 20:35:26 crc kubenswrapper[4790]: W0313 20:35:26.575960 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e374399_85bd_4121_9352_23a37bdf41f3.slice/crio-27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7 WatchSource:0}: Error finding container 27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7: Status 404 returned error can't find the container with id 27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7 Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.609248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerStarted","Data":"27a2b34e9f755b35c2ccc2f3d2b1c882d91d4c65a3a98d843d66190259160dd7"} Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611169 4790 generic.go:334] "Generic (PLEG): container finished" podID="81949470-5c0d-4294-8618-d6ee14da1d41" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" exitCode=0 Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611456 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611479 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerDied","Data":"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb"} Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-vqdfm" event={"ID":"81949470-5c0d-4294-8618-d6ee14da1d41","Type":"ContainerDied","Data":"403adc19adb7ec63c9d90ee6fa3c1500a5901074edab6bc1faa1e7eed14336b6"} Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.611564 4790 scope.go:117] "RemoveContainer" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.643713 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-zhllj"] Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.647055 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.647191 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.647403 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" containerName="registry" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.649652 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.651009 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhllj"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.651171 4790 scope.go:117] "RemoveContainer" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.651691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 13 20:35:26 crc kubenswrapper[4790]: E0313 20:35:26.652505 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb\": container with ID starting with 1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb not found: ID does not exist" containerID="1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.652531 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb"} err="failed to get container status \"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb\": rpc error: code = NotFound desc = could not find container \"1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb\": container with ID starting with 1a8a33812cb2e3b7aa735e1079ec9285f4a63e7f16e0cd92d97609c34a16eddb not found: ID does not exist" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677144 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677212 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677259 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677288 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677330 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677352 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677541 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"81949470-5c0d-4294-8618-d6ee14da1d41\" (UID: \"81949470-5c0d-4294-8618-d6ee14da1d41\") " Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.677966 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.679855 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687407 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687604 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8" (OuterVolumeSpecName: "kube-api-access-zf4v8") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "kube-api-access-zf4v8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.687649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.697782 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.703252 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "81949470-5c0d-4294-8618-d6ee14da1d41" (UID: "81949470-5c0d-4294-8618-d6ee14da1d41"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-catalog-content\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-utilities\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/0af82375-cffb-4861-82d2-5f1a0e4a8496-kube-api-access-rktt5\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778955 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778968 4790 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/81949470-5c0d-4294-8618-d6ee14da1d41-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778978 4790 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778987 4790 
reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/81949470-5c0d-4294-8618-d6ee14da1d41-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.778995 4790 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/81949470-5c0d-4294-8618-d6ee14da1d41-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.779003 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zf4v8\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-kube-api-access-zf4v8\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.779011 4790 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81949470-5c0d-4294-8618-d6ee14da1d41-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.880598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/0af82375-cffb-4861-82d2-5f1a0e4a8496-kube-api-access-rktt5\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881009 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-catalog-content\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881034 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-utilities\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-catalog-content\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.881739 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0af82375-cffb-4861-82d2-5f1a0e4a8496-utilities\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.896179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktt5\" (UniqueName: \"kubernetes.io/projected/0af82375-cffb-4861-82d2-5f1a0e4a8496-kube-api-access-rktt5\") pod \"redhat-marketplace-zhllj\" (UID: \"0af82375-cffb-4861-82d2-5f1a0e4a8496\") " pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.943266 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.947519 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-vqdfm"] Mar 13 20:35:26 crc kubenswrapper[4790]: I0313 20:35:26.978093 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.175773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-zhllj"] Mar 13 20:35:27 crc kubenswrapper[4790]: W0313 20:35:27.183295 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0af82375_cffb_4861_82d2_5f1a0e4a8496.slice/crio-2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45 WatchSource:0}: Error finding container 2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45: Status 404 returned error can't find the container with id 2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45 Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.619961 4790 generic.go:334] "Generic (PLEG): container finished" podID="0af82375-cffb-4861-82d2-5f1a0e4a8496" containerID="8d1a6ce11ec2379dc5ac63c4b98e51824c8d3e9f6f2f802ee9d1593ae4100871" exitCode=0 Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.620039 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerDied","Data":"8d1a6ce11ec2379dc5ac63c4b98e51824c8d3e9f6f2f802ee9d1593ae4100871"} Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.620110 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerStarted","Data":"2b780374c919c4b0fc2b00a707ef7e290af83206af862f66634973ad1bbfec45"} Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.622532 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e374399-85bd-4121-9352-23a37bdf41f3" containerID="9bb0f14a40c31d619fccd1d6803a86e518927862e7ce130d3b5bf77d32f80c8e" exitCode=0 Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.622837 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerDied","Data":"9bb0f14a40c31d619fccd1d6803a86e518927862e7ce130d3b5bf77d32f80c8e"} Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.671848 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d32cb2-55c9-48cc-9376-66231ae66f8a" path="/var/lib/kubelet/pods/36d32cb2-55c9-48cc-9376-66231ae66f8a/volumes" Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.672719 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81949470-5c0d-4294-8618-d6ee14da1d41" path="/var/lib/kubelet/pods/81949470-5c0d-4294-8618-d6ee14da1d41/volumes" Mar 13 20:35:27 crc kubenswrapper[4790]: I0313 20:35:27.673742 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e17d5bd1-f368-47a4-80cb-3bd3eb4b822c" path="/var/lib/kubelet/pods/e17d5bd1-f368-47a4-80cb-3bd3eb4b822c/volumes" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.442466 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b7b2s"] Mar 13 
20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.443971 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.449832 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.457117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7b2s"] Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.615158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-utilities\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.615231 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx98v\" (UniqueName: \"kubernetes.io/projected/5ad984b4-e6a7-4559-99e4-02a03eda6303-kube-api-access-fx98v\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.615267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-catalog-content\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716332 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-utilities\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx98v\" (UniqueName: \"kubernetes.io/projected/5ad984b4-e6a7-4559-99e4-02a03eda6303-kube-api-access-fx98v\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716464 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-catalog-content\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.716976 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-catalog-content\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.717168 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5ad984b4-e6a7-4559-99e4-02a03eda6303-utilities\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.747467 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx98v\" (UniqueName: \"kubernetes.io/projected/5ad984b4-e6a7-4559-99e4-02a03eda6303-kube-api-access-fx98v\") pod \"redhat-operators-b7b2s\" (UID: \"5ad984b4-e6a7-4559-99e4-02a03eda6303\") " pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:28 crc kubenswrapper[4790]: I0313 20:35:28.763123 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.047240 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.048433 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.052465 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.067844 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.171284 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7b2s"] Mar 13 20:35:29 crc kubenswrapper[4790]: W0313 20:35:29.207084 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ad984b4_e6a7_4559_99e4_02a03eda6303.slice/crio-867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450 WatchSource:0}: Error finding container 867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450: Status 404 returned error can't find the container with id 867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.223257 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.223397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.223443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.324206 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.324276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.324329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.325092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.325165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.345575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"community-operators-cpxlj\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.372018 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.652891 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ad984b4-e6a7-4559-99e4-02a03eda6303" containerID="1ea94cf34970da34b9edd56b52883436432d0846e4fd98c5242e8a483a82e44d" exitCode=0 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.652960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerDied","Data":"1ea94cf34970da34b9edd56b52883436432d0846e4fd98c5242e8a483a82e44d"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.653205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerStarted","Data":"867c8dce7491b555b064cff3826eb0e008ae4c6a9a2da5a4bb83919dc9c50450"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.657697 4790 generic.go:334] "Generic (PLEG): container finished" podID="0af82375-cffb-4861-82d2-5f1a0e4a8496" containerID="34d3cba630965a127d3ad6e5a44b41ae0df1de3961f5bc2e0d603f3d76f2f11a" exitCode=0 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.657791 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerDied","Data":"34d3cba630965a127d3ad6e5a44b41ae0df1de3961f5bc2e0d603f3d76f2f11a"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.660498 4790 generic.go:334] "Generic (PLEG): container finished" podID="9e374399-85bd-4121-9352-23a37bdf41f3" containerID="8e432a91687e5677ab3de20e9449f793af10ac2bdba711d5b20cace3b9558a25" exitCode=0 Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.668134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerDied","Data":"8e432a91687e5677ab3de20e9449f793af10ac2bdba711d5b20cace3b9558a25"} Mar 13 20:35:29 crc kubenswrapper[4790]: I0313 20:35:29.744953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.666654 4790 generic.go:334] "Generic (PLEG): container finished" podID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" exitCode=0 Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.666733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.667013 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerStarted","Data":"abfa15f6de4daed047e18e5a602cd0577d104072963eda4b67a1d006df7fb930"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.669642 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q5brt" event={"ID":"9e374399-85bd-4121-9352-23a37bdf41f3","Type":"ContainerStarted","Data":"7bb0abea474f4c7b73c8dae1ac16fd59304353b192910368f5641e4fd30c5921"} Mar 13 20:35:30 crc 
kubenswrapper[4790]: I0313 20:35:30.672269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-zhllj" event={"ID":"0af82375-cffb-4861-82d2-5f1a0e4a8496","Type":"ContainerStarted","Data":"790b4f0b0842a0325ae2abf506443c4d02d22849427a074f57e3867741930deb"} Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.712250 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q5brt" podStartSLOduration=2.202912698 podStartE2EDuration="4.712231964s" podCreationTimestamp="2026-03-13 20:35:26 +0000 UTC" firstStartedPulling="2026-03-13 20:35:27.624977275 +0000 UTC m=+458.646093186" lastFinishedPulling="2026-03-13 20:35:30.134296521 +0000 UTC m=+461.155412452" observedRunningTime="2026-03-13 20:35:30.709538899 +0000 UTC m=+461.730654800" watchObservedRunningTime="2026-03-13 20:35:30.712231964 +0000 UTC m=+461.733347875" Mar 13 20:35:30 crc kubenswrapper[4790]: I0313 20:35:30.732870 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-zhllj" podStartSLOduration=2.293901086 podStartE2EDuration="4.73285099s" podCreationTimestamp="2026-03-13 20:35:26 +0000 UTC" firstStartedPulling="2026-03-13 20:35:27.620989214 +0000 UTC m=+458.642105125" lastFinishedPulling="2026-03-13 20:35:30.059939138 +0000 UTC m=+461.081055029" observedRunningTime="2026-03-13 20:35:30.729845366 +0000 UTC m=+461.750961267" watchObservedRunningTime="2026-03-13 20:35:30.73285099 +0000 UTC m=+461.753966881" Mar 13 20:35:31 crc kubenswrapper[4790]: I0313 20:35:31.678811 4790 generic.go:334] "Generic (PLEG): container finished" podID="5ad984b4-e6a7-4559-99e4-02a03eda6303" containerID="4a73ccfcfd60d95f2e67e112f884ac61807932beb8afc3b52f8edbee5ed0f550" exitCode=0 Mar 13 20:35:31 crc kubenswrapper[4790]: I0313 20:35:31.678843 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerDied","Data":"4a73ccfcfd60d95f2e67e112f884ac61807932beb8afc3b52f8edbee5ed0f550"} Mar 13 20:35:32 crc kubenswrapper[4790]: I0313 20:35:32.687889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7b2s" event={"ID":"5ad984b4-e6a7-4559-99e4-02a03eda6303","Type":"ContainerStarted","Data":"537365b611324b431c87de833a1023e94baaefae81b46870a2ccc17d17c38283"} Mar 13 20:35:32 crc kubenswrapper[4790]: I0313 20:35:32.689994 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerStarted","Data":"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2"} Mar 13 20:35:32 crc kubenswrapper[4790]: I0313 20:35:32.720280 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b7b2s" podStartSLOduration=2.342240378 podStartE2EDuration="4.720254423s" podCreationTimestamp="2026-03-13 20:35:28 +0000 UTC" firstStartedPulling="2026-03-13 20:35:29.655426698 +0000 UTC m=+460.676542589" lastFinishedPulling="2026-03-13 20:35:32.033440743 +0000 UTC m=+463.054556634" observedRunningTime="2026-03-13 20:35:32.707899319 +0000 UTC m=+463.729015230" watchObservedRunningTime="2026-03-13 20:35:32.720254423 +0000 UTC m=+463.741370324" Mar 13 20:35:33 crc kubenswrapper[4790]: I0313 20:35:33.709142 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" exitCode=0 Mar 13 20:35:33 crc kubenswrapper[4790]: I0313 20:35:33.709269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2"} Mar 13 20:35:34 crc kubenswrapper[4790]: I0313 20:35:34.716300 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerStarted","Data":"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5"} Mar 13 20:35:34 crc kubenswrapper[4790]: I0313 20:35:34.733544 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cpxlj" podStartSLOduration=2.237924983 podStartE2EDuration="5.733527078s" podCreationTimestamp="2026-03-13 20:35:29 +0000 UTC" firstStartedPulling="2026-03-13 20:35:30.669448182 +0000 UTC m=+461.690564073" lastFinishedPulling="2026-03-13 20:35:34.165050267 +0000 UTC m=+465.186166168" observedRunningTime="2026-03-13 20:35:34.731404029 +0000 UTC m=+465.752519930" watchObservedRunningTime="2026-03-13 20:35:34.733527078 +0000 UTC m=+465.754642969" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.366688 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.366750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.435199 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.770287 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q5brt" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.978489 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:36 crc kubenswrapper[4790]: I0313 20:35:36.978781 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:37 crc kubenswrapper[4790]: I0313 20:35:37.015921 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:37 crc kubenswrapper[4790]: I0313 20:35:37.771793 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-zhllj" Mar 13 20:35:38 crc kubenswrapper[4790]: I0313 20:35:38.763592 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:38 crc kubenswrapper[4790]: I0313 20:35:38.763644 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.373203 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 
20:35:39.373269 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.415865 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.783564 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 20:35:39 crc kubenswrapper[4790]: I0313 20:35:39.804932 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b7b2s" podUID="5ad984b4-e6a7-4559-99e4-02a03eda6303" containerName="registry-server" probeResult="failure" output=< Mar 13 20:35:39 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:35:39 crc kubenswrapper[4790]: > Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.015969 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016294 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016337 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016827 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.016869 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6" gracePeriod=600 Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770446 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6" exitCode=0 Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6"} Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4"} Mar 13 20:35:44 crc kubenswrapper[4790]: I0313 20:35:44.770735 4790 scope.go:117] "RemoveContainer" containerID="a6707965cde5c2a45c65a034519d863ec0545443a29f4ac7f60d7d01e4e55400" Mar 13 20:35:48 crc kubenswrapper[4790]: I0313 20:35:48.820019 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:35:48 crc kubenswrapper[4790]: I0313 20:35:48.871304 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b7b2s" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.134069 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.135302 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.138118 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.138199 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.138217 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.141109 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.198613 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"auto-csr-approver-29557236-tczbl\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.300095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"auto-csr-approver-29557236-tczbl\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.317632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"auto-csr-approver-29557236-tczbl\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.491481 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.680843 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:36:00 crc kubenswrapper[4790]: I0313 20:36:00.868336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerStarted","Data":"ced10b9e5181bbae5848dbc4fffc41ceb2a125517a22f0d199aac485af29d451"} Mar 13 20:36:02 crc kubenswrapper[4790]: I0313 20:36:02.881564 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerStarted","Data":"51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258"} Mar 13 20:36:02 crc kubenswrapper[4790]: I0313 20:36:02.902500 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557236-tczbl" podStartSLOduration=1.069087164 podStartE2EDuration="2.902473203s" podCreationTimestamp="2026-03-13 20:36:00 +0000 UTC" firstStartedPulling="2026-03-13 20:36:00.683313288 +0000 UTC m=+491.704429179" lastFinishedPulling="2026-03-13 20:36:02.516699327 +0000 UTC m=+493.537815218" observedRunningTime="2026-03-13 20:36:02.896793794 +0000 UTC m=+493.917909725" watchObservedRunningTime="2026-03-13 20:36:02.902473203 +0000 UTC m=+493.923589134" Mar 13 20:36:03 crc kubenswrapper[4790]: I0313 20:36:03.888082 4790 generic.go:334] "Generic (PLEG): container finished" podID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerID="51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258" exitCode=0 Mar 13 20:36:03 crc kubenswrapper[4790]: I0313 20:36:03.888145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerDied","Data":"51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258"} Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.127169 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.160830 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") pod \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\" (UID: \"43b65fb5-f36b-4fae-ba13-03b5c81d1639\") " Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.167256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s" (OuterVolumeSpecName: "kube-api-access-gg82s") pod "43b65fb5-f36b-4fae-ba13-03b5c81d1639" (UID: "43b65fb5-f36b-4fae-ba13-03b5c81d1639"). InnerVolumeSpecName "kube-api-access-gg82s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.262305 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg82s\" (UniqueName: \"kubernetes.io/projected/43b65fb5-f36b-4fae-ba13-03b5c81d1639-kube-api-access-gg82s\") on node \"crc\" DevicePath \"\"" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.900289 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557236-tczbl" event={"ID":"43b65fb5-f36b-4fae-ba13-03b5c81d1639","Type":"ContainerDied","Data":"ced10b9e5181bbae5848dbc4fffc41ceb2a125517a22f0d199aac485af29d451"} Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.900324 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ced10b9e5181bbae5848dbc4fffc41ceb2a125517a22f0d199aac485af29d451" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.900368 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557236-tczbl" Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.947779 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:36:05 crc kubenswrapper[4790]: I0313 20:36:05.952741 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557230-8pqh8"] Mar 13 20:36:07 crc kubenswrapper[4790]: I0313 20:36:07.665971 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d598b7c0-7c77-4903-9138-d8a3d01f9efe" path="/var/lib/kubelet/pods/d598b7c0-7c77-4903-9138-d8a3d01f9efe/volumes" Mar 13 20:37:28 crc kubenswrapper[4790]: I0313 20:37:28.455123 4790 scope.go:117] "RemoveContainer" containerID="4dce60806026c2e057eacfafdb9eb0bcee1204f32aecb7bffa715ddddc59e383" Mar 13 20:37:44 crc kubenswrapper[4790]: I0313 20:37:44.016773 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:37:44 crc kubenswrapper[4790]: I0313 20:37:44.017250 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.135764 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:38:00 crc kubenswrapper[4790]: E0313 20:38:00.136435 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerName="oc" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.136450 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerName="oc" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.136586 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" containerName="oc" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.136995 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.141113 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.142864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.144063 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.150190 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.191568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"auto-csr-approver-29557238-jx8wj\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.292931 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"auto-csr-approver-29557238-jx8wj\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.311006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"auto-csr-approver-29557238-jx8wj\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.502092 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.697961 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:38:00 crc kubenswrapper[4790]: I0313 20:38:00.710100 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:38:01 crc kubenswrapper[4790]: I0313 20:38:01.632365 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" event={"ID":"6c63bf97-e702-439a-8f3b-58d4496c91b9","Type":"ContainerStarted","Data":"18aff9ce0a963102ddae683328354fb98941cf77a7a279bb1519f12a72af6599"} Mar 13 20:38:02 crc kubenswrapper[4790]: I0313 20:38:02.640783 4790 generic.go:334] "Generic (PLEG): container finished" podID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerID="a1eeddc06106c1113c4a31e23128dada69c832330fa1711ed5544055f1b4392f" exitCode=0 Mar 13 20:38:02 crc kubenswrapper[4790]: I0313 20:38:02.640983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" event={"ID":"6c63bf97-e702-439a-8f3b-58d4496c91b9","Type":"ContainerDied","Data":"a1eeddc06106c1113c4a31e23128dada69c832330fa1711ed5544055f1b4392f"} Mar 13 20:38:03 crc kubenswrapper[4790]: I0313 20:38:03.871644 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.047014 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") pod \"6c63bf97-e702-439a-8f3b-58d4496c91b9\" (UID: \"6c63bf97-e702-439a-8f3b-58d4496c91b9\") " Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.073240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk" (OuterVolumeSpecName: "kube-api-access-8rhdk") pod "6c63bf97-e702-439a-8f3b-58d4496c91b9" (UID: "6c63bf97-e702-439a-8f3b-58d4496c91b9"). InnerVolumeSpecName "kube-api-access-8rhdk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.148617 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rhdk\" (UniqueName: \"kubernetes.io/projected/6c63bf97-e702-439a-8f3b-58d4496c91b9-kube-api-access-8rhdk\") on node \"crc\" DevicePath \"\"" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.655805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" event={"ID":"6c63bf97-e702-439a-8f3b-58d4496c91b9","Type":"ContainerDied","Data":"18aff9ce0a963102ddae683328354fb98941cf77a7a279bb1519f12a72af6599"} Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.655852 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18aff9ce0a963102ddae683328354fb98941cf77a7a279bb1519f12a72af6599" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.655897 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557238-jx8wj" Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.933992 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:38:04 crc kubenswrapper[4790]: I0313 20:38:04.944757 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557232-bblq8"] Mar 13 20:38:05 crc kubenswrapper[4790]: I0313 20:38:05.667966 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b190462f-7836-44f0-94c0-1311bdf8e550" path="/var/lib/kubelet/pods/b190462f-7836-44f0-94c0-1311bdf8e550/volumes" Mar 13 20:38:14 crc kubenswrapper[4790]: I0313 20:38:14.016020 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:38:14 crc kubenswrapper[4790]: I0313 20:38:14.016721 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.015437 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.016146 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.016241 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.017507 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.017619 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4" gracePeriod=600 Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950312 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4" exitCode=0 Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950763 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4"} Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c"} Mar 13 20:38:44 crc kubenswrapper[4790]: I0313 20:38:44.950829 4790 scope.go:117] "RemoveContainer" containerID="88573fd1abdc5f0d1779ca1679bd1333545fafe5b76c1a0f0888a58d27d16db6" Mar 13 20:39:28 crc kubenswrapper[4790]: I0313 20:39:28.521745 4790 scope.go:117] "RemoveContainer" containerID="8f1a4232fe3ee20e22f3a57d7811b303dba4631c6cf2890a09449767842fc5b4" Mar 13 20:39:28 crc kubenswrapper[4790]: I0313 20:39:28.573767 4790 scope.go:117] "RemoveContainer" containerID="7924ab194fb126f41405d7a390a1fb75af9316272755308a5775fdb0f460db4d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.154761 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:40:00 crc kubenswrapper[4790]: E0313 20:40:00.157324 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerName="oc" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.157432 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerName="oc" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.157808 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" containerName="oc" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.158703 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.161232 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.161621 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.162100 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.164502 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.269484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"auto-csr-approver-29557240-8qw5d\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.371695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"auto-csr-approver-29557240-8qw5d\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.406564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"auto-csr-approver-29557240-8qw5d\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.485993 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:00 crc kubenswrapper[4790]: I0313 20:40:00.715051 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:40:01 crc kubenswrapper[4790]: I0313 20:40:01.478983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerStarted","Data":"0ab6a1d896fc66193d6078f1d3865ee51f8ae31a5063281d02344bd55f9ed347"} Mar 13 20:40:02 crc kubenswrapper[4790]: I0313 20:40:02.488524 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerStarted","Data":"a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f"} Mar 13 20:40:02 crc kubenswrapper[4790]: I0313 20:40:02.502880 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" podStartSLOduration=1.061056131 podStartE2EDuration="2.502864877s" podCreationTimestamp="2026-03-13 20:40:00 +0000 UTC" firstStartedPulling="2026-03-13 20:40:00.728484933 +0000 UTC m=+731.749600824" lastFinishedPulling="2026-03-13 20:40:02.170293669 +0000 UTC m=+733.191409570" observedRunningTime="2026-03-13 20:40:02.500416273 +0000 UTC m=+733.521532164" watchObservedRunningTime="2026-03-13 20:40:02.502864877 +0000 UTC m=+733.523980758" Mar 13 20:40:03 crc kubenswrapper[4790]: I0313 20:40:03.495256 4790 generic.go:334] "Generic (PLEG): container finished" podID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerID="a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f" exitCode=0 Mar 13 20:40:03 crc kubenswrapper[4790]: I0313 20:40:03.495356 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerDied","Data":"a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f"} Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.703187 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.825738 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") pod \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\" (UID: \"f6f1fa3a-7f88-4e89-bd00-4426798fccce\") " Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.833597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn" (OuterVolumeSpecName: "kube-api-access-kt9fn") pod "f6f1fa3a-7f88-4e89-bd00-4426798fccce" (UID: "f6f1fa3a-7f88-4e89-bd00-4426798fccce"). InnerVolumeSpecName "kube-api-access-kt9fn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:40:04 crc kubenswrapper[4790]: I0313 20:40:04.927805 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt9fn\" (UniqueName: \"kubernetes.io/projected/f6f1fa3a-7f88-4e89-bd00-4426798fccce-kube-api-access-kt9fn\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.511972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" event={"ID":"f6f1fa3a-7f88-4e89-bd00-4426798fccce","Type":"ContainerDied","Data":"0ab6a1d896fc66193d6078f1d3865ee51f8ae31a5063281d02344bd55f9ed347"} Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.512424 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ab6a1d896fc66193d6078f1d3865ee51f8ae31a5063281d02344bd55f9ed347" Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.512054 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557240-8qw5d" Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.568565 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.574647 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557234-6g6zh"] Mar 13 20:40:05 crc kubenswrapper[4790]: I0313 20:40:05.673094 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8e0ffa-a21f-4726-8185-2cff61c94b91" path="/var/lib/kubelet/pods/6b8e0ffa-a21f-4726-8185-2cff61c94b91/volumes" Mar 13 20:40:28 crc kubenswrapper[4790]: I0313 20:40:28.661575 4790 scope.go:117] "RemoveContainer" containerID="4a133641d0a543ddd92802af2ba335acfaf29e7ed5636f43383cb7790a817cba" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.483860 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg"] Mar 13 20:40:35 crc kubenswrapper[4790]: E0313 20:40:35.484601 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerName="oc" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.484613 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerName="oc" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.484730 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" containerName="oc" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.485177 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.487425 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.487681 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-5n769" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.487813 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.492104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-fgq7z"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.492931 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.498579 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-7pmvg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.504573 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.519034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p4h8t"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.519890 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.524904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-cns5w" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.529948 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fgq7z"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.539144 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p4h8t"] Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.560221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr4g9\" (UniqueName: \"kubernetes.io/projected/1430c143-e235-49e5-a141-78b9e3297b70-kube-api-access-mr4g9\") pod \"cert-manager-webhook-687f57d79b-p4h8t\" (UID: \"1430c143-e235-49e5-a141-78b9e3297b70\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.560313 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqp64\" (UniqueName: \"kubernetes.io/projected/f58ec868-a42c-463c-b65f-bf118fae6518-kube-api-access-gqp64\") pod \"cert-manager-cainjector-cf98fcc89-vfjwg\" (UID: \"f58ec868-a42c-463c-b65f-bf118fae6518\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.560360 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7rd2\" (UniqueName: \"kubernetes.io/projected/c77372fb-0649-4c32-be4f-34c3dd515246-kube-api-access-r7rd2\") pod \"cert-manager-858654f9db-fgq7z\" (UID: \"c77372fb-0649-4c32-be4f-34c3dd515246\") " pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc 
kubenswrapper[4790]: I0313 20:40:35.662052 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqp64\" (UniqueName: \"kubernetes.io/projected/f58ec868-a42c-463c-b65f-bf118fae6518-kube-api-access-gqp64\") pod \"cert-manager-cainjector-cf98fcc89-vfjwg\" (UID: \"f58ec868-a42c-463c-b65f-bf118fae6518\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.662103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7rd2\" (UniqueName: \"kubernetes.io/projected/c77372fb-0649-4c32-be4f-34c3dd515246-kube-api-access-r7rd2\") pod \"cert-manager-858654f9db-fgq7z\" (UID: \"c77372fb-0649-4c32-be4f-34c3dd515246\") " pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.662178 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr4g9\" (UniqueName: \"kubernetes.io/projected/1430c143-e235-49e5-a141-78b9e3297b70-kube-api-access-mr4g9\") pod \"cert-manager-webhook-687f57d79b-p4h8t\" (UID: \"1430c143-e235-49e5-a141-78b9e3297b70\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.682556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7rd2\" (UniqueName: \"kubernetes.io/projected/c77372fb-0649-4c32-be4f-34c3dd515246-kube-api-access-r7rd2\") pod \"cert-manager-858654f9db-fgq7z\" (UID: \"c77372fb-0649-4c32-be4f-34c3dd515246\") " pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.683622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqp64\" (UniqueName: \"kubernetes.io/projected/f58ec868-a42c-463c-b65f-bf118fae6518-kube-api-access-gqp64\") pod \"cert-manager-cainjector-cf98fcc89-vfjwg\" (UID: \"f58ec868-a42c-463c-b65f-bf118fae6518\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.687677 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr4g9\" (UniqueName: \"kubernetes.io/projected/1430c143-e235-49e5-a141-78b9e3297b70-kube-api-access-mr4g9\") pod \"cert-manager-webhook-687f57d79b-p4h8t\" (UID: \"1430c143-e235-49e5-a141-78b9e3297b70\") " pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.806403 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.822061 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-fgq7z" Mar 13 20:40:35 crc kubenswrapper[4790]: I0313 20:40:35.834669 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.231363 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg"] Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.283641 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-fgq7z"] Mar 13 20:40:36 crc kubenswrapper[4790]: W0313 20:40:36.284214 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc77372fb_0649_4c32_be4f_34c3dd515246.slice/crio-b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499 WatchSource:0}: Error finding container b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499: Status 404 returned error can't find the container with id b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499 Mar 13 20:40:36 crc kubenswrapper[4790]: W0313 20:40:36.285703 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1430c143_e235_49e5_a141_78b9e3297b70.slice/crio-529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565 WatchSource:0}: Error finding container 529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565: Status 404 returned error can't find the container with id 529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565 Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.288347 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-p4h8t"] Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.716147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" event={"ID":"1430c143-e235-49e5-a141-78b9e3297b70","Type":"ContainerStarted","Data":"529ed46a392eb8f87d5a80551f72f70ed053454c423a9e5e27093a115dc8b565"} Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.717836 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fgq7z" event={"ID":"c77372fb-0649-4c32-be4f-34c3dd515246","Type":"ContainerStarted","Data":"b5011cbba02125d299d3fc004e5a8cbde4229060971dfc309322456720607499"} Mar 13 20:40:36 crc kubenswrapper[4790]: I0313 20:40:36.718952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" event={"ID":"f58ec868-a42c-463c-b65f-bf118fae6518","Type":"ContainerStarted","Data":"8b003f0a7c04ab13268103f2c0fe33fc373be8d6947436c5e9755e4aeb8d239a"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.745172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" event={"ID":"f58ec868-a42c-463c-b65f-bf118fae6518","Type":"ContainerStarted","Data":"4bc28f7ed08d9aab402506f9e501d1f4f0a538ac2f7d888937e7cab8ccda1a95"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.746875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" event={"ID":"1430c143-e235-49e5-a141-78b9e3297b70","Type":"ContainerStarted","Data":"abdbe82f6d3c51720a6b25b25557fa1ad4e09a214dcf615f660a0c6dba440acc"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.747500 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 
20:40:40.749051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-fgq7z" event={"ID":"c77372fb-0649-4c32-be4f-34c3dd515246","Type":"ContainerStarted","Data":"4bf504403e04e8756565cb5837e470a7f72f492ccef20ce73fad57b8e3b45b46"} Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.767434 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-vfjwg" podStartSLOduration=1.664276987 podStartE2EDuration="5.767419096s" podCreationTimestamp="2026-03-13 20:40:35 +0000 UTC" firstStartedPulling="2026-03-13 20:40:36.243231928 +0000 UTC m=+767.264347819" lastFinishedPulling="2026-03-13 20:40:40.346373997 +0000 UTC m=+771.367489928" observedRunningTime="2026-03-13 20:40:40.760943834 +0000 UTC m=+771.782059725" watchObservedRunningTime="2026-03-13 20:40:40.767419096 +0000 UTC m=+771.788534987" Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.778319 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" podStartSLOduration=1.7246080799999999 podStartE2EDuration="5.778303523s" podCreationTimestamp="2026-03-13 20:40:35 +0000 UTC" firstStartedPulling="2026-03-13 20:40:36.289065949 +0000 UTC m=+767.310181840" lastFinishedPulling="2026-03-13 20:40:40.342761392 +0000 UTC m=+771.363877283" observedRunningTime="2026-03-13 20:40:40.774947784 +0000 UTC m=+771.796063675" watchObservedRunningTime="2026-03-13 20:40:40.778303523 +0000 UTC m=+771.799419414" Mar 13 20:40:40 crc kubenswrapper[4790]: I0313 20:40:40.796352 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-fgq7z" podStartSLOduration=1.668142988 podStartE2EDuration="5.79633436s" podCreationTimestamp="2026-03-13 20:40:35 +0000 UTC" firstStartedPulling="2026-03-13 20:40:36.286718827 +0000 UTC m=+767.307834718" lastFinishedPulling="2026-03-13 20:40:40.414910199 +0000 UTC m=+771.436026090" observedRunningTime="2026-03-13 20:40:40.794884091 +0000 UTC m=+771.815999982" watchObservedRunningTime="2026-03-13 20:40:40.79633436 +0000 UTC m=+771.817450251" Mar 13 20:40:44 crc kubenswrapper[4790]: I0313 20:40:44.015887 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:40:44 crc kubenswrapper[4790]: I0313 20:40:44.015955 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:40:45 crc kubenswrapper[4790]: I0313 20:40:45.837636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-p4h8t" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.015722 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.016767 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" 
containerID="cri-o://b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017201 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" containerID="cri-o://528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017250 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" containerID="cri-o://5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017293 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" containerID="cri-o://878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017332 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017377 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" containerID="cri-o://8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.017442 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" containerID="cri-o://8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.113735 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" containerID="cri-o://78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" gracePeriod=30 Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.407492 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.409822 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-acl-logging/0.log" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.410428 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-controller/0.log" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.410922 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.465896 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-slnjx"] Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466480 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466494 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466501 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466512 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kubecfg-setup" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466521 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kubecfg-setup" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466532 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466541 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466557 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466565 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466572 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466582 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466595 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466602 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466611 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466619 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466630 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466637 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466649 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466657 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.466667 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466675 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466819 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-acl-logging" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466831 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466841 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="sbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466853 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovn-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466865 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="nbdb" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466874 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="northd" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466886 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466894 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466903 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.466912 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="kube-rbac-proxy-node" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.467064 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 
20:40:55.467076 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: E0313 20:40:55.467092 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.467101 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.467218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerName="ovnkube-controller" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.469130 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515438 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515516 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515632 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash" (OuterVolumeSpecName: "host-slash") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515734 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515788 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.515971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516005 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516038 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516069 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516105 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516134 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516166 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket" (OuterVolumeSpecName: "log-socket") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516201 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log" (OuterVolumeSpecName: "node-log") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516247 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516250 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516405 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516460 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516511 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516519 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516584 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516637 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") pod \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\" (UID: \"a0c9dff4-5508-4391-bb03-6710c2b9f3b5\") " Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516681 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516760 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516776 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.516907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-log-socket\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517006 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-var-lib-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517050 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517110 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-netns\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-slash\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-ovn\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517266 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517307 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-kubelet\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517331 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-netd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517352 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngt5w\" (UniqueName: \"kubernetes.io/projected/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-kube-api-access-ngt5w\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-systemd-units\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517465 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-script-lib\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-bin\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517549 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-env-overrides\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-systemd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517649 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-config\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-node-log\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517715 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-etc-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517795 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovn-node-metrics-cert\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517904 4790 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-log-socket\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517922 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517936 4790 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-node-log\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517950 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517962 4790 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517976 4790 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517987 4790 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.517999 4790 reconciler_common.go:293] "Volume detached for volume 
\"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518011 4790 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518025 4790 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518038 4790 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518051 4790 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518063 4790 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518076 4790 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-slash\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518088 4790 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518100 4790 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.518111 4790 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.521275 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.522267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv" (OuterVolumeSpecName: "kube-api-access-h24bv") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "kube-api-access-h24bv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.530148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "a0c9dff4-5508-4391-bb03-6710c2b9f3b5" (UID: "a0c9dff4-5508-4391-bb03-6710c2b9f3b5"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-kubelet\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-netd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618906 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngt5w\" (UniqueName: \"kubernetes.io/projected/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-kube-api-access-ngt5w\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-systemd-units\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618946 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-script-lib\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-bin\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618990 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-env-overrides\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.618998 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-netd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-systemd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-config\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-node-log\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619132 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-etc-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619157 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovn-node-metrics-cert\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-kubelet\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-log-socket\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619041 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-systemd\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-log-socket\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619254 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-cni-bin\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619263 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-var-lib-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619287 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-node-log\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619326 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-netns\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-slash\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619370 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619449 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-ovn\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-systemd-units\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619484 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619553 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-etc-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619567 4790 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619615 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619630 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h24bv\" (UniqueName: \"kubernetes.io/projected/a0c9dff4-5508-4391-bb03-6710c2b9f3b5-kube-api-access-h24bv\") on node \"crc\" DevicePath \"\"" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-run-netns\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-var-lib-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619780 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-openvswitch\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619790 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-slash\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.619816 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-run-ovn\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620166 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-env-overrides\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-script-lib\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620377 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovnkube-config\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.620502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.623750 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-ovn-node-metrics-cert\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.637115 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngt5w\" (UniqueName: \"kubernetes.io/projected/53ef6fca-53d7-43a3-8d94-3a29f09cefc7-kube-api-access-ngt5w\") pod \"ovnkube-node-slnjx\" (UID: \"53ef6fca-53d7-43a3-8d94-3a29f09cefc7\") " pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: I0313 20:40:55.786993 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:40:55 crc kubenswrapper[4790]: W0313 20:40:55.813497 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53ef6fca_53d7_43a3_8d94_3a29f09cefc7.slice/crio-a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9 WatchSource:0}: Error finding container a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9: Status 404 returned error can't find the container with id a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127148 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/2.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127723 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/1.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127770 4790 generic.go:334] "Generic (PLEG): container finished" podID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" containerID="5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221" exitCode=2 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerDied","Data":"5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.127896 4790 scope.go:117] "RemoveContainer" containerID="9f1f5c4bce1d70f87af694909ff1520e5030abd584b21b0e93f42a9f4328ed9e" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.128443 4790 scope.go:117] "RemoveContainer" containerID="5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.128662 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-x2tjg_openshift-multus(207e7f49-094a-4e59-a8ff-9eacd8d6fe2a)\"" pod="openshift-multus/multus-x2tjg" podUID="207e7f49-094a-4e59-a8ff-9eacd8d6fe2a" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.134861 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovnkube-controller/3.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.137452 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-acl-logging/0.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.137977 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-gz4fj_a0c9dff4-5508-4391-bb03-6710c2b9f3b5/ovn-controller/0.log" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138445 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138530 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" 
containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138586 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138647 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138702 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138757 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138811 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" exitCode=143 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138867 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" exitCode=143 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138675 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.138518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139148 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139180 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139190 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139197 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139204 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139211 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139219 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139227 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139234 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139261 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139271 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139282 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139292 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 
20:40:56.139299 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139307 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139313 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139337 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139345 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139352 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139358 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139366 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139419 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139428 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139435 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139442 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139449 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 
20:40:56.139456 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139507 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139514 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139521 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139527 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-gz4fj" event={"ID":"a0c9dff4-5508-4391-bb03-6710c2b9f3b5","Type":"ContainerDied","Data":"7c759d9eac24045ee77e532dda62f3a6c5e2ed387c3e9d1e970d8448a87220c0"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139547 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139556 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139594 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139604 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139610 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139617 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139624 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139630 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 
20:40:56.139636 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.139643 4790 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.141724 4790 generic.go:334] "Generic (PLEG): container finished" podID="53ef6fca-53d7-43a3-8d94-3a29f09cefc7" containerID="e1626ebac9dfac2a9c22f6978706c491f9807c012cee772ed96fdf2a048f10b7" exitCode=0 Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.141852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerDied","Data":"e1626ebac9dfac2a9c22f6978706c491f9807c012cee772ed96fdf2a048f10b7"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.141964 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"a44a7f9cd07ab910e8e72360dca5fb6cd8b5e0b60939dcff80414d537984e1c9"} Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.185905 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.217694 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.221021 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-gz4fj"] Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.235312 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.266793 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.282271 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.301205 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.342889 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.363023 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.384207 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.420873 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.474636 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.490814 4790 scope.go:117] 
"RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.491317 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491355 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491395 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.491831 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491908 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.491958 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.492471 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.492499 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 
528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.492520 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.493072 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493114 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493134 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.493614 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493635 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.493674 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.494009 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494036 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc 
error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494051 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.494399 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494423 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494440 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.494777 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494802 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.494819 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.495115 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495141 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495158 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: E0313 20:40:56.495540 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495565 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495581 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495866 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.495887 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496236 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496259 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496582 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496602 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496903 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.496931 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497190 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497207 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497593 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497614 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497914 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 
13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.497932 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498194 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498211 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498507 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498524 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498816 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.498833 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499185 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499213 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499514 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status 
\"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499532 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499787 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.499803 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500124 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500146 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500512 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500534 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500872 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.500894 4790 scope.go:117] "RemoveContainer" 
containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501247 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501271 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501577 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501601 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501937 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.501955 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502270 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502285 4790 scope.go:117] "RemoveContainer" containerID="78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502615 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c"} err="failed to get container status \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": rpc error: code = NotFound desc = could not find 
container \"78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c\": container with ID starting with 78eb113d3271ecde8479d63b5204be287383170ff22f841a47378d03ebeb474c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502634 4790 scope.go:117] "RemoveContainer" containerID="add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.502992 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f"} err="failed to get container status \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": rpc error: code = NotFound desc = could not find container \"add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f\": container with ID starting with add0d84fe125cc821490a777c7d16aa569eede7394325bf958a57e9f330f464f not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503010 4790 scope.go:117] "RemoveContainer" containerID="528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503266 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd"} err="failed to get container status \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": rpc error: code = NotFound desc = could not find container \"528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd\": container with ID starting with 528364efe6888cb4021898321e37096160566cff03ce73996e61668c2651a2dd not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503284 4790 scope.go:117] "RemoveContainer" containerID="5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503532 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167"} err="failed to get container status \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": rpc error: code = NotFound desc = could not find container \"5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167\": container with ID starting with 5562d6dd87ded1f4f5ebcf07f9cab74ae2d64702837365c3cd102c3c567b7167 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.503547 4790 scope.go:117] "RemoveContainer" containerID="878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504083 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96"} err="failed to get container status \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": rpc error: code = NotFound desc = could not find container \"878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96\": container with ID starting with 878e01e2b84f0e55421cad51bd481ccbf0a4c99223a9ff14214c6458af7faa96 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504102 4790 scope.go:117] "RemoveContainer" containerID="eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504411 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec"} err="failed to get container status \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": rpc error: code = NotFound desc = could not find container \"eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec\": container with ID starting with eb59538c278dd25a673268164a4c0da366677d05ee0f1771dc8c886ffa86c3ec not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504435 4790 scope.go:117] "RemoveContainer" containerID="8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504790 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c"} err="failed to get container status \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": rpc error: code = NotFound desc = could not find container \"8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c\": container with ID starting with 8923b794ec23e9e12adaf74ebba5f449b396be081c25840da21dc736bcc5205c not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.504813 4790 scope.go:117] "RemoveContainer" containerID="8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505115 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453"} err="failed to get container status \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": rpc error: code = NotFound desc = could not find container \"8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453\": container with ID starting with 8af62dcf02b9482111c31af2eed4bfcd241ddfd74d4542d213990530f9e1d453 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505137 4790 scope.go:117] "RemoveContainer" containerID="b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505439 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438"} err="failed to get container status \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": rpc error: code = NotFound desc = could not find container \"b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438\": container with ID starting with b9ae5eba8b7eb1c70b82978666b2ac3ab62b9704317b05f73132151611d8f438 not found: ID does not exist" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505467 4790 scope.go:117] "RemoveContainer" containerID="f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768" Mar 13 20:40:56 crc kubenswrapper[4790]: I0313 20:40:56.505817 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768"} err="failed to get container status \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": rpc error: code = NotFound desc = could not find container \"f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768\": container with ID starting with 
f9d405eae18f66537bd2586b13d38ce83070a8c587ed17ecf5b0f0dde35b5768 not found: ID does not exist" Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.157683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"6aed25839fda9ef82da8cdf8a54bbb1153e9be0e50ace1d41afa4232a5c3f02d"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158014 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"510339f6f8da757231c4c47aac0c734cc2940ff0578e3fbd62814c9e118ff6b1"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"c406f3dd7ae8fb51d3ec4666101f9955cb3361b5a13f153799ea8b3d2c610d88"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158040 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"14ffe1cd22064fd24c3d3d662fbcc0523d30a139620cc52b7ecee44bebb49956"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"8b30842bff1c3268a3a2d67fa4f2cbb7177a0a9f34737190eb1176e1a2c70080"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.158065 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"081c75b02f7fa77a1d992c1a2b12a291cb3a0bcb515cd3f115037d7250608bfe"} Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.159042 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/2.log" Mar 13 20:40:57 crc kubenswrapper[4790]: I0313 20:40:57.672944 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0c9dff4-5508-4391-bb03-6710c2b9f3b5" path="/var/lib/kubelet/pods/a0c9dff4-5508-4391-bb03-6710c2b9f3b5/volumes" Mar 13 20:41:00 crc kubenswrapper[4790]: I0313 20:41:00.189048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"490c72b57e53c5188ced2cc1e8a30d664b1c561d7d375b1398cdf59179252de3"} Mar 13 20:41:02 crc kubenswrapper[4790]: I0313 20:41:02.208697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" event={"ID":"53ef6fca-53d7-43a3-8d94-3a29f09cefc7","Type":"ContainerStarted","Data":"db83114523f379f590611fc9a77d035b663ae2f250efe670e96ad0e03365e2a2"} Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.215419 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.215997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.244446 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:03 crc kubenswrapper[4790]: I0313 20:41:03.274937 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" podStartSLOduration=8.274915016 podStartE2EDuration="8.274915016s" podCreationTimestamp="2026-03-13 20:40:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:41:03.248802985 +0000 UTC m=+794.269918886" watchObservedRunningTime="2026-03-13 20:41:03.274915016 +0000 UTC m=+794.296030907" Mar 13 20:41:04 crc kubenswrapper[4790]: I0313 20:41:04.221538 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:04 crc kubenswrapper[4790]: I0313 20:41:04.257084 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:11 crc kubenswrapper[4790]: I0313 20:41:11.659723 4790 scope.go:117] "RemoveContainer" containerID="5a664c8908a82d034ede1821b9b77be44539b262b67dbd487d1b8e0a90a94221" Mar 13 20:41:12 crc kubenswrapper[4790]: I0313 20:41:12.277878 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-x2tjg_207e7f49-094a-4e59-a8ff-9eacd8d6fe2a/kube-multus/2.log" Mar 13 20:41:12 crc kubenswrapper[4790]: I0313 20:41:12.278273 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-x2tjg" event={"ID":"207e7f49-094a-4e59-a8ff-9eacd8d6fe2a","Type":"ContainerStarted","Data":"7d6d3b206a300169a846037d851026e58ef95aff89b8688100fcc3c7cd819164"} Mar 13 20:41:14 crc kubenswrapper[4790]: I0313 20:41:14.015894 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:41:14 crc kubenswrapper[4790]: I0313 20:41:14.016511 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.152575 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v"] Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.154487 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.157744 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v"] Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.158313 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.291431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.291494 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.291518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392062 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392122 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392589 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.392622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.410004 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.477904 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:22 crc kubenswrapper[4790]: I0313 20:41:22.707953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v"] Mar 13 20:41:22 crc kubenswrapper[4790]: W0313 20:41:22.714678 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16b499fa_d8a4_4f3f_bcaf_aa9fa7b43854.slice/crio-2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3 WatchSource:0}: Error finding container 2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3: Status 404 returned error can't find the container with id 2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3 Mar 13 20:41:23 crc kubenswrapper[4790]: I0313 20:41:23.351077 4790 generic.go:334] "Generic (PLEG): container finished" podID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerID="29db7f843e5eb6932a590b6089f2077fb8134b027aedbf5bd8d88a8ecd0dfd07" exitCode=0 Mar 13 20:41:23 crc kubenswrapper[4790]: I0313 20:41:23.351135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"29db7f843e5eb6932a590b6089f2077fb8134b027aedbf5bd8d88a8ecd0dfd07"} Mar 13 20:41:23 crc kubenswrapper[4790]: I0313 20:41:23.351163 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerStarted","Data":"2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3"} Mar 13 20:41:25 crc kubenswrapper[4790]: I0313 20:41:25.363626 4790 generic.go:334] "Generic (PLEG): container finished" podID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerID="5c12e9d48e3107cfc1c450549d21e1d27c785d58c90ee901969e43971943f9c1" exitCode=0 Mar 13 20:41:25 crc kubenswrapper[4790]: I0313 20:41:25.363720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"5c12e9d48e3107cfc1c450549d21e1d27c785d58c90ee901969e43971943f9c1"} Mar 13 20:41:25 crc kubenswrapper[4790]: I0313 20:41:25.818496 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-slnjx" Mar 13 20:41:26 crc kubenswrapper[4790]: I0313 20:41:26.372839 4790 generic.go:334] "Generic (PLEG): container finished" podID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerID="58323afdd50e070f00b267a705c22daed7d2836118e84819cbc88623904dd505" exitCode=0 Mar 13 20:41:26 crc kubenswrapper[4790]: I0313 20:41:26.372884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"58323afdd50e070f00b267a705c22daed7d2836118e84819cbc88623904dd505"} Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.690359 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.783597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") pod \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.819126 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util" (OuterVolumeSpecName: "util") pod "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" (UID: "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.884935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") pod \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.885074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") pod \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\" (UID: \"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854\") " Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.885671 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-util\") on node \"crc\" DevicePath \"\"" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.886262 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle" (OuterVolumeSpecName: "bundle") pod "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" (UID: "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.891493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc" (OuterVolumeSpecName: "kube-api-access-2c7tc") pod "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" (UID: "16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854"). InnerVolumeSpecName "kube-api-access-2c7tc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.986479 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c7tc\" (UniqueName: \"kubernetes.io/projected/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-kube-api-access-2c7tc\") on node \"crc\" DevicePath \"\"" Mar 13 20:41:27 crc kubenswrapper[4790]: I0313 20:41:27.986522 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:41:28 crc kubenswrapper[4790]: I0313 20:41:28.399872 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" event={"ID":"16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854","Type":"ContainerDied","Data":"2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3"} Mar 13 20:41:28 crc kubenswrapper[4790]: I0313 20:41:28.399918 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a9b770391618be51bfecf5e5e350aa3ccecdf6e566f58a46e98ada7617319d3" Mar 13 20:41:28 crc kubenswrapper[4790]: I0313 20:41:28.399941 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.009414 4790 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527537 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv"] Mar 13 20:41:33 crc kubenswrapper[4790]: E0313 20:41:33.527748 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="pull" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527761 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="pull" Mar 13 20:41:33 crc kubenswrapper[4790]: E0313 20:41:33.527778 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="extract" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527785 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="extract" Mar 13 20:41:33 crc kubenswrapper[4790]: E0313 20:41:33.527803 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="util" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527810 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" containerName="util" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.527902 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854" 
containerName="extract" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.528283 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.529960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-w2dt4" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.530038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.530522 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.543040 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv"] Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.661069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hnmj\" (UniqueName: \"kubernetes.io/projected/4d5f9755-21a7-482e-8788-85ed86738b40-kube-api-access-7hnmj\") pod \"nmstate-operator-796d4cfff4-4lvtv\" (UID: \"4d5f9755-21a7-482e-8788-85ed86738b40\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.762918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hnmj\" (UniqueName: \"kubernetes.io/projected/4d5f9755-21a7-482e-8788-85ed86738b40-kube-api-access-7hnmj\") pod \"nmstate-operator-796d4cfff4-4lvtv\" (UID: \"4d5f9755-21a7-482e-8788-85ed86738b40\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.781508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hnmj\" (UniqueName: \"kubernetes.io/projected/4d5f9755-21a7-482e-8788-85ed86738b40-kube-api-access-7hnmj\") pod \"nmstate-operator-796d4cfff4-4lvtv\" (UID: \"4d5f9755-21a7-482e-8788-85ed86738b40\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:33 crc kubenswrapper[4790]: I0313 20:41:33.845206 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" Mar 13 20:41:34 crc kubenswrapper[4790]: I0313 20:41:34.283499 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv"] Mar 13 20:41:34 crc kubenswrapper[4790]: I0313 20:41:34.437184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" event={"ID":"4d5f9755-21a7-482e-8788-85ed86738b40","Type":"ContainerStarted","Data":"07a4557833fe001890bdeb37abfe81b6abb4be6a5e5df0e8dfd9dd8354ba3129"} Mar 13 20:41:37 crc kubenswrapper[4790]: I0313 20:41:37.456481 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" event={"ID":"4d5f9755-21a7-482e-8788-85ed86738b40","Type":"ContainerStarted","Data":"9cbf7026d5cf7dc8ace2a5809a91f0f78cd3b97654ae49ad9dced8d2f687e7a5"} Mar 13 20:41:37 crc kubenswrapper[4790]: I0313 20:41:37.492331 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-4lvtv" podStartSLOduration=2.296707132 podStartE2EDuration="4.492316912s" podCreationTimestamp="2026-03-13 20:41:33 +0000 UTC" firstStartedPulling="2026-03-13 20:41:34.294476464 +0000 UTC m=+825.315592355" lastFinishedPulling="2026-03-13 20:41:36.490086244 +0000 UTC m=+827.511202135" observedRunningTime="2026-03-13 20:41:37.489135785 +0000 UTC m=+828.510251676" watchObservedRunningTime="2026-03-13 20:41:37.492316912 +0000 UTC m=+828.513432803" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.304724 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.306070 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: W0313 20:41:42.309568 4790 reflector.go:561] object-"openshift-nmstate"/"nmstate-handler-dockercfg-dbjg2": failed to list *v1.Secret: secrets "nmstate-handler-dockercfg-dbjg2" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-nmstate": no relationship found between node 'crc' and this object Mar 13 20:41:42 crc kubenswrapper[4790]: E0313 20:41:42.309639 4790 reflector.go:158] "Unhandled Error" err="object-\"openshift-nmstate\"/\"nmstate-handler-dockercfg-dbjg2\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"nmstate-handler-dockercfg-dbjg2\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-nmstate\": no relationship found between node 'crc' and this object" logger="UnhandledError" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.355505 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.355588 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qld4w"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.356649 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.358601 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.368160 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-b2697"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.368976 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.390815 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qld4w"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.435178 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.435927 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.437691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.437746 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.437827 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qr5bt" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.450195 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474657 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvzcz\" (UniqueName: \"kubernetes.io/projected/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-kube-api-access-xvzcz\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474718 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4p44\" (UniqueName: \"kubernetes.io/projected/e1a3b709-858c-4bca-b52b-c96dc23d9149-kube-api-access-q4p44\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-ovs-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbpdg\" (UniqueName: \"kubernetes.io/projected/4295503b-996b-4a20-844b-07a90de225a6-kube-api-access-kbpdg\") pod \"nmstate-metrics-9b8c8685d-wvv95\" (UID: \"4295503b-996b-4a20-844b-07a90de225a6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 
20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.474920 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-nmstate-lock\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.475029 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-dbus-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.475064 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a3b709-858c-4bca-b52b-c96dc23d9149-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6wq\" (UniqueName: \"kubernetes.io/projected/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-kube-api-access-7s6wq\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-ovs-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576819 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbpdg\" (UniqueName: \"kubernetes.io/projected/4295503b-996b-4a20-844b-07a90de225a6-kube-api-access-kbpdg\") pod \"nmstate-metrics-9b8c8685d-wvv95\" (UID: \"4295503b-996b-4a20-844b-07a90de225a6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576846 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576897 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576907 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-ovs-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576934 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-nmstate-lock\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576966 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-nmstate-lock\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.576982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-dbus-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577007 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a3b709-858c-4bca-b52b-c96dc23d9149-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvzcz\" (UniqueName: \"kubernetes.io/projected/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-kube-api-access-xvzcz\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4p44\" (UniqueName: \"kubernetes.io/projected/e1a3b709-858c-4bca-b52b-c96dc23d9149-kube-api-access-q4p44\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.577472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-dbus-socket\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.591352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/e1a3b709-858c-4bca-b52b-c96dc23d9149-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.602215 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbpdg\" (UniqueName: 
\"kubernetes.io/projected/4295503b-996b-4a20-844b-07a90de225a6-kube-api-access-kbpdg\") pod \"nmstate-metrics-9b8c8685d-wvv95\" (UID: \"4295503b-996b-4a20-844b-07a90de225a6\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.604864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4p44\" (UniqueName: \"kubernetes.io/projected/e1a3b709-858c-4bca-b52b-c96dc23d9149-kube-api-access-q4p44\") pod \"nmstate-webhook-5f558f5558-qld4w\" (UID: \"e1a3b709-858c-4bca-b52b-c96dc23d9149\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.615927 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvzcz\" (UniqueName: \"kubernetes.io/projected/d5c9a572-635b-4ecc-a2a4-c7e459d6d510-kube-api-access-xvzcz\") pod \"nmstate-handler-b2697\" (UID: \"d5c9a572-635b-4ecc-a2a4-c7e459d6d510\") " pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.635073 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77d465584b-7dwm5"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.636345 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.658594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d465584b-7dwm5"] Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.678642 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.678750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6wq\" (UniqueName: \"kubernetes.io/projected/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-kube-api-access-7s6wq\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.678774 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.679539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.681532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: 
\"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.697608 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6wq\" (UniqueName: \"kubernetes.io/projected/c7ef6baa-3c87-44a8-91d2-bcfbc0696396-kube-api-access-7s6wq\") pod \"nmstate-console-plugin-86f58fcf4-k8mcs\" (UID: \"c7ef6baa-3c87-44a8-91d2-bcfbc0696396\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.748845 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.779671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.779980 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-oauth-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780127 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-trusted-ca-bundle\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kh6z6\" (UniqueName: \"kubernetes.io/projected/7d4e30e7-0446-4370-bfad-e2824747e0fe-kube-api-access-kh6z6\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780348 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-oauth-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.780621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-service-ca\") pod \"console-77d465584b-7dwm5\" (UID: 
\"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-service-ca\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-oauth-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-trusted-ca-bundle\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882348 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kh6z6\" (UniqueName: \"kubernetes.io/projected/7d4e30e7-0446-4370-bfad-e2824747e0fe-kube-api-access-kh6z6\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.882372 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-oauth-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.884941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-oauth-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.885422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-config\") pod \"console-77d465584b-7dwm5\" (UID: 
\"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.885422 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-trusted-ca-bundle\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.885544 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/7d4e30e7-0446-4370-bfad-e2824747e0fe-service-ca\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.888604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-serving-cert\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.898309 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/7d4e30e7-0446-4370-bfad-e2824747e0fe-console-oauth-config\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.905950 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kh6z6\" (UniqueName: \"kubernetes.io/projected/7d4e30e7-0446-4370-bfad-e2824747e0fe-kube-api-access-kh6z6\") pod \"console-77d465584b-7dwm5\" (UID: \"7d4e30e7-0446-4370-bfad-e2824747e0fe\") " pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:42 crc kubenswrapper[4790]: I0313 20:41:42.963581 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.161779 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs"] Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.242658 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d465584b-7dwm5"] Mar 13 20:41:43 crc kubenswrapper[4790]: W0313 20:41:43.247364 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4e30e7_0446_4370_bfad_e2824747e0fe.slice/crio-ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1 WatchSource:0}: Error finding container ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1: Status 404 returned error can't find the container with id ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1 Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.492237 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" event={"ID":"c7ef6baa-3c87-44a8-91d2-bcfbc0696396","Type":"ContainerStarted","Data":"846d6731c2416f6ca3400ce228a4640bf9f27862d957edf9e4d6432423abc67f"} Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.494224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d465584b-7dwm5" event={"ID":"7d4e30e7-0446-4370-bfad-e2824747e0fe","Type":"ContainerStarted","Data":"895ba00e796bfdae6263f67d5d233af9b5adf62ff51fe803cc5bb3ef2ca47f23"} Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.494265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d465584b-7dwm5" event={"ID":"7d4e30e7-0446-4370-bfad-e2824747e0fe","Type":"ContainerStarted","Data":"ef7e8518a7b219f8729d4c62b131d1d8fd423d3128ff7ad868a5b281775b62e1"} Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.625279 4790 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.625350 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.674241 4790 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.674321 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.690303 4790 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-nmstate/nmstate-handler-b2697" secret="" err="failed to sync secret cache: timed out waiting for the condition" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.690415 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.722292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-dbjg2" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.852078 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77d465584b-7dwm5" podStartSLOduration=1.852060071 podStartE2EDuration="1.852060071s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:41:43.524534101 +0000 UTC m=+834.545650012" watchObservedRunningTime="2026-03-13 20:41:43.852060071 +0000 UTC m=+834.873175962" Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.854540 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95"] Mar 13 20:41:43 crc kubenswrapper[4790]: W0313 20:41:43.864554 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4295503b_996b_4a20_844b_07a90de225a6.slice/crio-fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6 WatchSource:0}: Error finding container fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6: Status 404 returned error can't find the container with id fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6 Mar 13 20:41:43 crc kubenswrapper[4790]: I0313 20:41:43.891411 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-qld4w"] Mar 13 20:41:43 crc kubenswrapper[4790]: W0313 20:41:43.897875 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a3b709_858c_4bca_b52b_c96dc23d9149.slice/crio-2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32 WatchSource:0}: Error finding container 2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32: Status 404 returned error can't find the container with id 2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32 Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.015593 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.015648 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.015688 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.016059 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.016424 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c" gracePeriod=600 Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514365 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c" exitCode=0 Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514800 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.514838 4790 scope.go:117] "RemoveContainer" containerID="876ea65d0ee844d8eca512c0665da98289a1647386d506ab2af3d32c73dd69b4" Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.515812 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b2697" event={"ID":"d5c9a572-635b-4ecc-a2a4-c7e459d6d510","Type":"ContainerStarted","Data":"ddc80cbcf43c4c96d31a8e3cc04162581c56b8a391408fde03ae1a1481dd63d9"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.518165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" event={"ID":"e1a3b709-858c-4bca-b52b-c96dc23d9149","Type":"ContainerStarted","Data":"2873a133fccdf3da6a8a39f9a35dd13450d9f11bf2904a9ab76b350e4d4e6c32"} Mar 13 20:41:44 crc kubenswrapper[4790]: I0313 20:41:44.519323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" event={"ID":"4295503b-996b-4a20-844b-07a90de225a6","Type":"ContainerStarted","Data":"fba28eeeb0bd2aa0b49a53041177030187ec9ef0ebb6a5a671e6dc28d39900b6"} Mar 13 20:41:46 crc kubenswrapper[4790]: I0313 20:41:46.535607 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" event={"ID":"c7ef6baa-3c87-44a8-91d2-bcfbc0696396","Type":"ContainerStarted","Data":"b4cb991ef4d053abb965c3d016877324844701636042a2b010228ec59cbc5e5f"} Mar 13 20:41:46 crc kubenswrapper[4790]: I0313 20:41:46.562122 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-k8mcs" podStartSLOduration=2.177108334 podStartE2EDuration="4.562101622s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.169478351 +0000 UTC m=+834.190594242" lastFinishedPulling="2026-03-13 20:41:45.554471639 +0000 UTC m=+836.575587530" observedRunningTime="2026-03-13 20:41:46.558080795 +0000 UTC m=+837.579196686" watchObservedRunningTime="2026-03-13 20:41:46.562101622 +0000 UTC m=+837.583217513" Mar 13 20:41:47 crc 
kubenswrapper[4790]: I0313 20:41:47.542839 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-b2697" event={"ID":"d5c9a572-635b-4ecc-a2a4-c7e459d6d510","Type":"ContainerStarted","Data":"ea4d76800a4baf79fd39c71f6201900141eeffb7999edd66a020107e37307343"} Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.543966 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.545336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" event={"ID":"e1a3b709-858c-4bca-b52b-c96dc23d9149","Type":"ContainerStarted","Data":"152fe3720b2999f06b7bcae5f9e4b3a19918cc4161b995157a04dba1b462b246"} Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.545431 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.546828 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" event={"ID":"4295503b-996b-4a20-844b-07a90de225a6","Type":"ContainerStarted","Data":"3490528d1ce2cf70c747456b42a262d605def2c49f3747afb1ee75fecbe7aa70"} Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.557239 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-b2697" podStartSLOduration=2.860254338 podStartE2EDuration="5.557219788s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.720225249 +0000 UTC m=+834.741341140" lastFinishedPulling="2026-03-13 20:41:46.417190699 +0000 UTC m=+837.438306590" observedRunningTime="2026-03-13 20:41:47.556665254 +0000 UTC m=+838.577781155" watchObservedRunningTime="2026-03-13 20:41:47.557219788 +0000 UTC m=+838.578335679" Mar 13 20:41:47 crc kubenswrapper[4790]: I0313 20:41:47.577383 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" podStartSLOduration=3.077947987 podStartE2EDuration="5.57735156s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.899830084 +0000 UTC m=+834.920945975" lastFinishedPulling="2026-03-13 20:41:46.399233657 +0000 UTC m=+837.420349548" observedRunningTime="2026-03-13 20:41:47.572127179 +0000 UTC m=+838.593243080" watchObservedRunningTime="2026-03-13 20:41:47.57735156 +0000 UTC m=+838.598467481" Mar 13 20:41:49 crc kubenswrapper[4790]: I0313 20:41:49.558912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" event={"ID":"4295503b-996b-4a20-844b-07a90de225a6","Type":"ContainerStarted","Data":"8302713ee76c52b13528f8b8de7c7ab9f67e43244468b5763c009d7db89fa3a5"} Mar 13 20:41:49 crc kubenswrapper[4790]: I0313 20:41:49.608312 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-wvv95" podStartSLOduration=2.213288966 podStartE2EDuration="7.608216323s" podCreationTimestamp="2026-03-13 20:41:42 +0000 UTC" firstStartedPulling="2026-03-13 20:41:43.866083088 +0000 UTC m=+834.887198979" lastFinishedPulling="2026-03-13 20:41:49.261010435 +0000 UTC m=+840.282126336" observedRunningTime="2026-03-13 20:41:49.583070147 +0000 UTC m=+840.604186088" watchObservedRunningTime="2026-03-13 20:41:49.608216323 +0000 UTC m=+840.629332264" Mar 13 20:41:52 crc 
kubenswrapper[4790]: I0313 20:41:52.964442 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:52 crc kubenswrapper[4790]: I0313 20:41:52.964717 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:52 crc kubenswrapper[4790]: I0313 20:41:52.971659 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:53 crc kubenswrapper[4790]: I0313 20:41:53.595608 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77d465584b-7dwm5" Mar 13 20:41:53 crc kubenswrapper[4790]: I0313 20:41:53.656951 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:41:53 crc kubenswrapper[4790]: I0313 20:41:53.727630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-b2697" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.141548 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.143228 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.146338 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.146975 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.147787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.153512 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.235605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"auto-csr-approver-29557242-lp8qf\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.337113 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"auto-csr-approver-29557242-lp8qf\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.357830 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"auto-csr-approver-29557242-lp8qf\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.499521 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:00 crc kubenswrapper[4790]: I0313 20:42:00.941708 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:42:00 crc kubenswrapper[4790]: W0313 20:42:00.952684 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6027d153_5f8e_4bb1_8275_9a8df8c533f2.slice/crio-ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89 WatchSource:0}: Error finding container ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89: Status 404 returned error can't find the container with id ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89 Mar 13 20:42:01 crc kubenswrapper[4790]: I0313 20:42:01.648266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" event={"ID":"6027d153-5f8e-4bb1-8275-9a8df8c533f2","Type":"ContainerStarted","Data":"ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89"} Mar 13 20:42:02 crc kubenswrapper[4790]: I0313 20:42:02.656591 4790 generic.go:334] "Generic (PLEG): container finished" podID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerID="31ce3becbe5f9fc73efb71d7c9c70a67bb2549c4e27e76481e3678501a4317cf" exitCode=0 Mar 13 20:42:02 crc kubenswrapper[4790]: I0313 20:42:02.656716 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" event={"ID":"6027d153-5f8e-4bb1-8275-9a8df8c533f2","Type":"ContainerDied","Data":"31ce3becbe5f9fc73efb71d7c9c70a67bb2549c4e27e76481e3678501a4317cf"} Mar 13 20:42:03 crc kubenswrapper[4790]: I0313 20:42:03.682740 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-qld4w" Mar 13 20:42:03 crc kubenswrapper[4790]: I0313 20:42:03.919727 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.092153 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") pod \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\" (UID: \"6027d153-5f8e-4bb1-8275-9a8df8c533f2\") " Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.101434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4" (OuterVolumeSpecName: "kube-api-access-g2sz4") pod "6027d153-5f8e-4bb1-8275-9a8df8c533f2" (UID: "6027d153-5f8e-4bb1-8275-9a8df8c533f2"). InnerVolumeSpecName "kube-api-access-g2sz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.194353 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2sz4\" (UniqueName: \"kubernetes.io/projected/6027d153-5f8e-4bb1-8275-9a8df8c533f2-kube-api-access-g2sz4\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.674611 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" event={"ID":"6027d153-5f8e-4bb1-8275-9a8df8c533f2","Type":"ContainerDied","Data":"ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89"} Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.674666 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea830fbd6719c35b739a9a7f305932cbf5ba79466ea92c4d1f475ccdcebafa89" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.674672 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557242-lp8qf" Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.968167 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:42:04 crc kubenswrapper[4790]: I0313 20:42:04.974064 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557236-tczbl"] Mar 13 20:42:05 crc kubenswrapper[4790]: I0313 20:42:05.669123 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43b65fb5-f36b-4fae-ba13-03b5c81d1639" path="/var/lib/kubelet/pods/43b65fb5-f36b-4fae-ba13-03b5c81d1639/volumes" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.153015 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px"] Mar 13 20:42:18 crc kubenswrapper[4790]: E0313 20:42:18.153800 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerName="oc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.153816 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerName="oc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.153974 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" containerName="oc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.155060 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.160497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px"] Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.162546 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.315320 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.315721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.315750 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.416810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.416864 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.416917 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.417482 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.417485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.437839 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.478898 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.707242 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-q5j7f" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" containerID="cri-o://40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" gracePeriod=15 Mar 13 20:42:18 crc kubenswrapper[4790]: I0313 20:42:18.860361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px"] Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.534324 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q5j7f_d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c/console/0.log" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.534408 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631049 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631409 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631476 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") pod \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\" (UID: \"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c\") " Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.631978 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.632038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config" (OuterVolumeSpecName: "console-config") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.632453 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.632455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca" (OuterVolumeSpecName: "service-ca") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.637354 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v" (OuterVolumeSpecName: "kube-api-access-chx4v") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "kube-api-access-chx4v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.637691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.638069 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" (UID: "d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732622 4790 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732667 4790 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-service-ca\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732679 4790 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732691 4790 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732705 4790 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732715 4790 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.732726 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chx4v\" (UniqueName: \"kubernetes.io/projected/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c-kube-api-access-chx4v\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.766854 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-q5j7f_d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c/console/0.log" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.766934 4790 generic.go:334] "Generic (PLEG): container finished" podID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" exitCode=2 Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerDied","Data":"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767023 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-q5j7f" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767060 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-q5j7f" event={"ID":"d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c","Type":"ContainerDied","Data":"af91b2c2002cfba8d95ebe9f9e0aa50107b9d61f68613dde04ff9ae4ab302650"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.767087 4790 scope.go:117] "RemoveContainer" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.768969 4790 generic.go:334] "Generic (PLEG): container finished" podID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerID="73d77ad67ac4d15b04010b038b87d30e7703e7f28501c37a118699adcf6e336f" exitCode=0 Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.769051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"73d77ad67ac4d15b04010b038b87d30e7703e7f28501c37a118699adcf6e336f"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.769119 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerStarted","Data":"b434b30cdb21943ca53f18eaf1729db4fddc700a45eee3938607a5e3f003edd9"} Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.782658 4790 scope.go:117] "RemoveContainer" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" Mar 13 20:42:19 crc kubenswrapper[4790]: E0313 20:42:19.783125 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80\": container with ID starting with 40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80 not found: ID does not exist" containerID="40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.783161 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80"} err="failed to get container status \"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80\": rpc error: code = NotFound desc = could not find container \"40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80\": container with ID starting with 40e979965860a7fb028ab6266a5903890c3d017367e215cd659e149443363f80 not found: ID does not exist" Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.813413 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:42:19 crc kubenswrapper[4790]: I0313 20:42:19.817861 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-q5j7f"] Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.509354 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:20 crc kubenswrapper[4790]: E0313 20:42:20.509759 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.509780 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.509996 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" containerName="console" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.511335 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.515541 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.659308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.659401 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.659433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.760661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.760716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.760739 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.761507 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.761548 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.781024 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"redhat-operators-fwvj9\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:20 crc kubenswrapper[4790]: I0313 20:42:20.883060 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.096793 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.668422 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c" path="/var/lib/kubelet/pods/d65f2eb2-dccc-4ca5-b00d-23a0bdbd0a9c/volumes" Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.782289 4790 generic.go:334] "Generic (PLEG): container finished" podID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerID="30a13725d5a0929ecab855711341517cfdbcf9f6459a5c37ea3088910ca64874" exitCode=0 Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.782360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"30a13725d5a0929ecab855711341517cfdbcf9f6459a5c37ea3088910ca64874"} Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.783779 4790 generic.go:334] "Generic (PLEG): container finished" podID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" exitCode=0 Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.783826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16"} Mar 13 20:42:21 crc kubenswrapper[4790]: I0313 20:42:21.783854 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerStarted","Data":"17a8fc74803e09cccf53786141283b651260e0e7c4aaf11d9d5e161783ce7bac"} Mar 13 20:42:22 crc kubenswrapper[4790]: I0313 20:42:22.791984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerStarted","Data":"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e"} Mar 13 20:42:22 crc kubenswrapper[4790]: I0313 20:42:22.794207 4790 generic.go:334] "Generic (PLEG): container finished" podID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerID="8d8bbe3287546ed2b9b806b01bb8d444399ce245956ee3b45cb06c98793275c8" exitCode=0 Mar 13 20:42:22 crc kubenswrapper[4790]: I0313 20:42:22.794272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"8d8bbe3287546ed2b9b806b01bb8d444399ce245956ee3b45cb06c98793275c8"} Mar 13 20:42:23 crc kubenswrapper[4790]: I0313 20:42:23.813191 4790 generic.go:334] "Generic (PLEG): container finished" podID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" exitCode=0 Mar 13 20:42:23 crc kubenswrapper[4790]: I0313 20:42:23.813317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e"} Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.103260 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.303712 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") pod \"6940903a-9dc5-4001-bc87-9de2bdce9e52\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.304920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") pod \"6940903a-9dc5-4001-bc87-9de2bdce9e52\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.305009 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") pod \"6940903a-9dc5-4001-bc87-9de2bdce9e52\" (UID: \"6940903a-9dc5-4001-bc87-9de2bdce9e52\") " Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.306028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle" (OuterVolumeSpecName: "bundle") pod "6940903a-9dc5-4001-bc87-9de2bdce9e52" (UID: "6940903a-9dc5-4001-bc87-9de2bdce9e52"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.311055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst" (OuterVolumeSpecName: "kube-api-access-mtwst") pod "6940903a-9dc5-4001-bc87-9de2bdce9e52" (UID: "6940903a-9dc5-4001-bc87-9de2bdce9e52"). InnerVolumeSpecName "kube-api-access-mtwst". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.321857 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util" (OuterVolumeSpecName: "util") pod "6940903a-9dc5-4001-bc87-9de2bdce9e52" (UID: "6940903a-9dc5-4001-bc87-9de2bdce9e52"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.406775 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-util\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.406827 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtwst\" (UniqueName: \"kubernetes.io/projected/6940903a-9dc5-4001-bc87-9de2bdce9e52-kube-api-access-mtwst\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.406845 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6940903a-9dc5-4001-bc87-9de2bdce9e52-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.821715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" event={"ID":"6940903a-9dc5-4001-bc87-9de2bdce9e52","Type":"ContainerDied","Data":"b434b30cdb21943ca53f18eaf1729db4fddc700a45eee3938607a5e3f003edd9"} Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.822025 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b434b30cdb21943ca53f18eaf1729db4fddc700a45eee3938607a5e3f003edd9" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.821807 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px" Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.823629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerStarted","Data":"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156"} Mar 13 20:42:24 crc kubenswrapper[4790]: I0313 20:42:24.841608 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fwvj9" podStartSLOduration=2.242014551 podStartE2EDuration="4.841579993s" podCreationTimestamp="2026-03-13 20:42:20 +0000 UTC" firstStartedPulling="2026-03-13 20:42:21.785117607 +0000 UTC m=+872.806233508" lastFinishedPulling="2026-03-13 20:42:24.384683059 +0000 UTC m=+875.405798950" observedRunningTime="2026-03-13 20:42:24.840201245 +0000 UTC m=+875.861317146" watchObservedRunningTime="2026-03-13 20:42:24.841579993 +0000 UTC m=+875.862695884" Mar 13 20:42:28 crc kubenswrapper[4790]: I0313 20:42:28.774130 4790 scope.go:117] "RemoveContainer" containerID="51921e4e629fa9d413e53a9a5c93f032ad474743b6e67b583c5b1e6927de7258" Mar 13 20:42:30 crc kubenswrapper[4790]: I0313 20:42:30.884654 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:30 crc kubenswrapper[4790]: I0313 20:42:30.885146 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:31 crc kubenswrapper[4790]: I0313 20:42:31.937568 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fwvj9" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" probeResult="failure" output=< Mar 13 20:42:31 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:42:31 crc 
kubenswrapper[4790]: > Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210564 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54"] Mar 13 20:42:35 crc kubenswrapper[4790]: E0313 20:42:35.210792 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="util" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210803 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="util" Mar 13 20:42:35 crc kubenswrapper[4790]: E0313 20:42:35.210821 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="pull" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210827 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="pull" Mar 13 20:42:35 crc kubenswrapper[4790]: E0313 20:42:35.210837 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="extract" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210843 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="extract" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.210950 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6940903a-9dc5-4001-bc87-9de2bdce9e52" containerName="extract" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.211339 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.213769 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xh6l6" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.215298 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.227684 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.344739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-apiservice-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.344817 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhg54\" (UniqueName: 
\"kubernetes.io/projected/da23093d-500f-43f4-805a-b4a252e40940-kube-api-access-lhg54\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.344848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-webhook-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.446090 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-apiservice-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.446146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhg54\" (UniqueName: \"kubernetes.io/projected/da23093d-500f-43f4-805a-b4a252e40940-kube-api-access-lhg54\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.446173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-webhook-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.451887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-webhook-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.464179 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhg54\" (UniqueName: \"kubernetes.io/projected/da23093d-500f-43f4-805a-b4a252e40940-kube-api-access-lhg54\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.468453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/da23093d-500f-43f4-805a-b4a252e40940-apiservice-cert\") pod \"metallb-operator-controller-manager-6c885c8d8c-fcv54\" (UID: \"da23093d-500f-43f4-805a-b4a252e40940\") " pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.526421 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.536365 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.537174 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.544755 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-6jfg4" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.545001 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.547475 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.565196 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.648022 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-apiservice-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.648071 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw7kq\" (UniqueName: \"kubernetes.io/projected/783be831-b522-42a0-9cbe-f234ed3a027c-kube-api-access-tw7kq\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.648157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-webhook-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.749428 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-apiservice-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.749475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw7kq\" (UniqueName: \"kubernetes.io/projected/783be831-b522-42a0-9cbe-f234ed3a027c-kube-api-access-tw7kq\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.749538 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-webhook-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.754234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-webhook-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.754623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/783be831-b522-42a0-9cbe-f234ed3a027c-apiservice-cert\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.768851 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw7kq\" (UniqueName: \"kubernetes.io/projected/783be831-b522-42a0-9cbe-f234ed3a027c-kube-api-access-tw7kq\") pod \"metallb-operator-webhook-server-76c9b767d4-c6mq2\" (UID: \"783be831-b522-42a0-9cbe-f234ed3a027c\") " pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.774184 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54"] Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.890178 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" event={"ID":"da23093d-500f-43f4-805a-b4a252e40940","Type":"ContainerStarted","Data":"3f5fc4c636aafa39e2b75bfe1d26cc0ce009e8e3fa9f2c626da8ffb85d7cfb70"} Mar 13 20:42:35 crc kubenswrapper[4790]: I0313 20:42:35.905137 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:36 crc kubenswrapper[4790]: I0313 20:42:36.329149 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2"] Mar 13 20:42:36 crc kubenswrapper[4790]: W0313 20:42:36.332865 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod783be831_b522_42a0_9cbe_f234ed3a027c.slice/crio-cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350 WatchSource:0}: Error finding container cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350: Status 404 returned error can't find the container with id cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350 Mar 13 20:42:36 crc kubenswrapper[4790]: I0313 20:42:36.896303 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" event={"ID":"783be831-b522-42a0-9cbe-f234ed3a027c","Type":"ContainerStarted","Data":"cb3e7b3fbb5f35b7f0dea46a4260b3e53f91db6f09af72ca3c4e0995390a6350"} Mar 13 20:42:39 crc kubenswrapper[4790]: I0313 20:42:39.917142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" event={"ID":"da23093d-500f-43f4-805a-b4a252e40940","Type":"ContainerStarted","Data":"1ae63466639d55a8b537202415dad25349ac714c24132420120fa23ce9544150"} Mar 13 20:42:39 crc kubenswrapper[4790]: I0313 20:42:39.917874 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:42:39 crc kubenswrapper[4790]: I0313 20:42:39.943501 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" podStartSLOduration=1.951263334 podStartE2EDuration="4.943422428s" podCreationTimestamp="2026-03-13 20:42:35 +0000 UTC" firstStartedPulling="2026-03-13 20:42:35.785154721 +0000 UTC m=+886.806270622" lastFinishedPulling="2026-03-13 20:42:38.777313825 +0000 UTC m=+889.798429716" observedRunningTime="2026-03-13 20:42:39.937027165 +0000 UTC m=+890.958143066" watchObservedRunningTime="2026-03-13 20:42:39.943422428 +0000 UTC m=+890.964538319" Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.923597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" event={"ID":"783be831-b522-42a0-9cbe-f234ed3a027c","Type":"ContainerStarted","Data":"7bc869e32accf2119f4809ada290969661cc46360e4f1d973aa3e7018afa894f"} Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.939279 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.941483 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" podStartSLOduration=1.957889454 podStartE2EDuration="5.941470728s" podCreationTimestamp="2026-03-13 20:42:35 +0000 UTC" firstStartedPulling="2026-03-13 20:42:36.335809925 +0000 UTC m=+887.356925816" lastFinishedPulling="2026-03-13 20:42:40.319391199 +0000 UTC m=+891.340507090" observedRunningTime="2026-03-13 20:42:40.940154122 +0000 UTC m=+891.961270013" watchObservedRunningTime="2026-03-13 20:42:40.941470728 +0000 UTC m=+891.962586619" Mar 13 
20:42:40 crc kubenswrapper[4790]: I0313 20:42:40.988604 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:41 crc kubenswrapper[4790]: I0313 20:42:41.930711 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:42:42 crc kubenswrapper[4790]: I0313 20:42:42.686958 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:42 crc kubenswrapper[4790]: I0313 20:42:42.935851 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fwvj9" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" containerID="cri-o://e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" gracePeriod=2 Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.404926 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.575899 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") pod \"63d87b2a-7e33-4196-a549-c618ac863a8b\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.576053 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") pod \"63d87b2a-7e33-4196-a549-c618ac863a8b\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.576123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") pod \"63d87b2a-7e33-4196-a549-c618ac863a8b\" (UID: \"63d87b2a-7e33-4196-a549-c618ac863a8b\") " Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.576951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities" (OuterVolumeSpecName: "utilities") pod "63d87b2a-7e33-4196-a549-c618ac863a8b" (UID: "63d87b2a-7e33-4196-a549-c618ac863a8b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.581434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2" (OuterVolumeSpecName: "kube-api-access-trdh2") pod "63d87b2a-7e33-4196-a549-c618ac863a8b" (UID: "63d87b2a-7e33-4196-a549-c618ac863a8b"). InnerVolumeSpecName "kube-api-access-trdh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.677456 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-trdh2\" (UniqueName: \"kubernetes.io/projected/63d87b2a-7e33-4196-a549-c618ac863a8b-kube-api-access-trdh2\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.677520 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.722081 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "63d87b2a-7e33-4196-a549-c618ac863a8b" (UID: "63d87b2a-7e33-4196-a549-c618ac863a8b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.778690 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/63d87b2a-7e33-4196-a549-c618ac863a8b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945161 4790 generic.go:334] "Generic (PLEG): container finished" podID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" exitCode=0 Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945204 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156"} Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fwvj9" event={"ID":"63d87b2a-7e33-4196-a549-c618ac863a8b","Type":"ContainerDied","Data":"17a8fc74803e09cccf53786141283b651260e0e7c4aaf11d9d5e161783ce7bac"} Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945249 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fwvj9" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.945252 4790 scope.go:117] "RemoveContainer" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.972761 4790 scope.go:117] "RemoveContainer" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.973965 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.978704 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fwvj9"] Mar 13 20:42:43 crc kubenswrapper[4790]: I0313 20:42:43.998718 4790 scope.go:117] "RemoveContainer" containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.017628 4790 scope.go:117] "RemoveContainer" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" Mar 13 20:42:44 crc kubenswrapper[4790]: E0313 20:42:44.018035 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156\": container with ID starting with e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156 not found: ID does not exist" containerID="e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018068 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156"} err="failed to get container status \"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156\": rpc error: code = NotFound desc = could not find container \"e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156\": container with ID starting with e09cddd877a58458967ba3a234518016a9a7687ed18e5176ebbcf0133f804156 not found: ID does not exist" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018103 4790 scope.go:117] "RemoveContainer" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" Mar 13 20:42:44 crc kubenswrapper[4790]: E0313 20:42:44.018407 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e\": container with ID starting with 59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e not found: ID does not exist" containerID="59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018433 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e"} err="failed to get container status \"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e\": rpc error: code = NotFound desc = could not find container \"59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e\": container with ID starting with 59da1cd10f2757655920c711addf66d5866e4a93d339f7c5a99dd41800fc582e not found: ID does not exist" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018445 4790 scope.go:117] "RemoveContainer" 
containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" Mar 13 20:42:44 crc kubenswrapper[4790]: E0313 20:42:44.018642 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16\": container with ID starting with e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16 not found: ID does not exist" containerID="e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16" Mar 13 20:42:44 crc kubenswrapper[4790]: I0313 20:42:44.018663 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16"} err="failed to get container status \"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16\": rpc error: code = NotFound desc = could not find container \"e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16\": container with ID starting with e5bae04cdffb8c83701a42fc9db405a6dca31c8fa75b143d095f27d9c8d79f16 not found: ID does not exist" Mar 13 20:42:45 crc kubenswrapper[4790]: I0313 20:42:45.666472 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" path="/var/lib/kubelet/pods/63d87b2a-7e33-4196-a549-c618ac863a8b/volumes" Mar 13 20:42:55 crc kubenswrapper[4790]: I0313 20:42:55.909781 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-76c9b767d4-c6mq2" Mar 13 20:43:15 crc kubenswrapper[4790]: I0313 20:43:15.529343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6c885c8d8c-fcv54" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.228747 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-r97zs"] Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.229054 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-content" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-content" Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.229082 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-utilities" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229089 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="extract-utilities" Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.229104 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229112 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.229234 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="63d87b2a-7e33-4196-a549-c618ac863a8b" containerName="registry-server" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.231361 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.232908 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.234212 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.235036 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.235096 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-skcdc" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.235221 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.237245 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.244751 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.321868 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5tk2m"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.322914 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325069 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-6v8lw" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325149 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325184 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.325224 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.342647 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-czl9k"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.343559 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.351656 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.358814 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-czl9k"] Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.413946 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-sockets\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pshkb\" (UniqueName: \"kubernetes.io/projected/3ab7e856-a311-4e29-aabf-adaa27363613-kube-api-access-pshkb\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414263 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-reloader\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab7e856-a311-4e29-aabf-adaa27363613-metrics-certs\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-metrics\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414339 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/472cc73a-53fe-4d7c-aec8-b2154023ba90-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3ab7e856-a311-4e29-aabf-adaa27363613-frr-startup\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-conf\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.414434 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72d28\" (UniqueName: \"kubernetes.io/projected/472cc73a-53fe-4d7c-aec8-b2154023ba90-kube-api-access-72d28\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515417 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72d28\" (UniqueName: \"kubernetes.io/projected/472cc73a-53fe-4d7c-aec8-b2154023ba90-kube-api-access-72d28\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515462 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-conf\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515497 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-sockets\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pshkb\" (UniqueName: \"kubernetes.io/projected/3ab7e856-a311-4e29-aabf-adaa27363613-kube-api-access-pshkb\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a3729738-ead5-47e0-95de-04dc39fb0516-metallb-excludel2\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7vz4\" (UniqueName: \"kubernetes.io/projected/a3729738-ead5-47e0-95de-04dc39fb0516-kube-api-access-h7vz4\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6zgg\" (UniqueName: \"kubernetes.io/projected/d5ef8654-e56f-454b-9fae-0753a30dab0f-kube-api-access-x6zgg\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-metrics-certs\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: 
I0313 20:43:16.515636 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-reloader\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515650 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab7e856-a311-4e29-aabf-adaa27363613-metrics-certs\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-metrics-certs\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515708 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-cert\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515734 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/472cc73a-53fe-4d7c-aec8-b2154023ba90-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.515747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-metrics\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-sockets\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516114 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-frr-conf\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516229 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-reloader\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " 
pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.516826 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3ab7e856-a311-4e29-aabf-adaa27363613-frr-startup\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.517022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3ab7e856-a311-4e29-aabf-adaa27363613-metrics\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.517559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3ab7e856-a311-4e29-aabf-adaa27363613-frr-startup\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.525952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3ab7e856-a311-4e29-aabf-adaa27363613-metrics-certs\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.526460 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/472cc73a-53fe-4d7c-aec8-b2154023ba90-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.534563 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pshkb\" (UniqueName: \"kubernetes.io/projected/3ab7e856-a311-4e29-aabf-adaa27363613-kube-api-access-pshkb\") pod \"frr-k8s-r97zs\" (UID: \"3ab7e856-a311-4e29-aabf-adaa27363613\") " pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.538307 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72d28\" (UniqueName: \"kubernetes.io/projected/472cc73a-53fe-4d7c-aec8-b2154023ba90-kube-api-access-72d28\") pod \"frr-k8s-webhook-server-bcc4b6f68-8ckr8\" (UID: \"472cc73a-53fe-4d7c-aec8-b2154023ba90\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.564395 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.573695 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.617982 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a3729738-ead5-47e0-95de-04dc39fb0516-metallb-excludel2\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7vz4\" (UniqueName: \"kubernetes.io/projected/a3729738-ead5-47e0-95de-04dc39fb0516-kube-api-access-h7vz4\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618046 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6zgg\" (UniqueName: \"kubernetes.io/projected/d5ef8654-e56f-454b-9fae-0753a30dab0f-kube-api-access-x6zgg\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618065 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-metrics-certs\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-metrics-certs\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.618148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-cert\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.619022 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/a3729738-ead5-47e0-95de-04dc39fb0516-metallb-excludel2\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.620056 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 20:43:16 crc kubenswrapper[4790]: E0313 20:43:16.620190 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist podName:a3729738-ead5-47e0-95de-04dc39fb0516 nodeName:}" failed. 
No retries permitted until 2026-03-13 20:43:17.12016708 +0000 UTC m=+928.141282971 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist") pod "speaker-5tk2m" (UID: "a3729738-ead5-47e0-95de-04dc39fb0516") : secret "metallb-memberlist" not found Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.623441 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-metrics-certs\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.623947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-metrics-certs\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.627498 4790 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.632711 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d5ef8654-e56f-454b-9fae-0753a30dab0f-cert\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.652505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7vz4\" (UniqueName: \"kubernetes.io/projected/a3729738-ead5-47e0-95de-04dc39fb0516-kube-api-access-h7vz4\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.655097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6zgg\" (UniqueName: \"kubernetes.io/projected/d5ef8654-e56f-454b-9fae-0753a30dab0f-kube-api-access-x6zgg\") pod \"controller-7bb4cc7c98-czl9k\" (UID: \"d5ef8654-e56f-454b-9fae-0753a30dab0f\") " pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.660775 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:16 crc kubenswrapper[4790]: I0313 20:43:16.746338 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.125563 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:17 crc kubenswrapper[4790]: E0313 20:43:17.125978 4790 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 13 20:43:17 crc kubenswrapper[4790]: E0313 20:43:17.126040 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist podName:a3729738-ead5-47e0-95de-04dc39fb0516 nodeName:}" failed. 
No retries permitted until 2026-03-13 20:43:18.126022573 +0000 UTC m=+929.147138464 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist") pod "speaker-5tk2m" (UID: "a3729738-ead5-47e0-95de-04dc39fb0516") : secret "metallb-memberlist" not found Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.126822 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-czl9k"] Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.129819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8"] Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.276128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-czl9k" event={"ID":"d5ef8654-e56f-454b-9fae-0753a30dab0f","Type":"ContainerStarted","Data":"68fd06d4f6af3b0016e03fa0aadc5c6a3704ee615972d40e0574fd87d3734a67"} Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.276169 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-czl9k" event={"ID":"d5ef8654-e56f-454b-9fae-0753a30dab0f","Type":"ContainerStarted","Data":"ea7004bcf36dbb656400b38b97f3df09925aa43298c8c545be2b91bed7e4efd7"} Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.278058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" event={"ID":"472cc73a-53fe-4d7c-aec8-b2154023ba90","Type":"ContainerStarted","Data":"e244eb751be0e2222b0a9e2d0b566d189bd0df4161e470b0c39884b6e54be354"} Mar 13 20:43:17 crc kubenswrapper[4790]: I0313 20:43:17.279930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"a5c13d5df3d78f0f5c75b78a768ccfd97cea573967c0ed084bb8b8c745280933"} Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.140110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.146142 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/a3729738-ead5-47e0-95de-04dc39fb0516-memberlist\") pod \"speaker-5tk2m\" (UID: \"a3729738-ead5-47e0-95de-04dc39fb0516\") " pod="metallb-system/speaker-5tk2m" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.289430 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-czl9k" event={"ID":"d5ef8654-e56f-454b-9fae-0753a30dab0f","Type":"ContainerStarted","Data":"d70dc94a0fc88513769f3de09d52205171892a67339c2b41e4f7a90c537ef9d4"} Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.290640 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.310960 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-czl9k" podStartSLOduration=2.310918768 podStartE2EDuration="2.310918768s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:43:18.307098594 +0000 UTC m=+929.328214485" watchObservedRunningTime="2026-03-13 20:43:18.310918768 +0000 UTC m=+929.332034659" Mar 13 20:43:18 crc kubenswrapper[4790]: I0313 20:43:18.445348 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:18 crc kubenswrapper[4790]: W0313 20:43:18.479257 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3729738_ead5_47e0_95de_04dc39fb0516.slice/crio-b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a WatchSource:0}: Error finding container b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a: Status 404 returned error can't find the container with id b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.300865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tk2m" event={"ID":"a3729738-ead5-47e0-95de-04dc39fb0516","Type":"ContainerStarted","Data":"3ae5de1a33157acac843c2f1b7002af2f2799488b9a72e5c21b2c1d9d878eaa1"} Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.301423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tk2m" event={"ID":"a3729738-ead5-47e0-95de-04dc39fb0516","Type":"ContainerStarted","Data":"f8d2f65b8e4e5b233e46774703123181a7404d39c2265bfe084a46e8ed71b1f9"} Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.301442 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5tk2m" event={"ID":"a3729738-ead5-47e0-95de-04dc39fb0516","Type":"ContainerStarted","Data":"b5c4e8d214625e52c8d2b57e21b6ea4720f48e981fcf8670bb2d4261739e6e9a"} Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.302641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:19 crc kubenswrapper[4790]: I0313 20:43:19.326015 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5tk2m" podStartSLOduration=3.325996646 podStartE2EDuration="3.325996646s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:43:19.316464447 +0000 UTC m=+930.337580348" watchObservedRunningTime="2026-03-13 20:43:19.325996646 +0000 UTC m=+930.347112537" Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.344763 4790 generic.go:334] "Generic (PLEG): container finished" podID="3ab7e856-a311-4e29-aabf-adaa27363613" containerID="64b6e9a811b920351f37a00f8395a4bdfed37c50e7e92c2ab9d0b43fbfb9a502" exitCode=0 Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.344826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerDied","Data":"64b6e9a811b920351f37a00f8395a4bdfed37c50e7e92c2ab9d0b43fbfb9a502"} Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.347234 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" event={"ID":"472cc73a-53fe-4d7c-aec8-b2154023ba90","Type":"ContainerStarted","Data":"46636b011c58b0a72156b97685eb290373197aaf928bdc7010bb84803ecaba6b"} Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.347513 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:24 crc kubenswrapper[4790]: I0313 20:43:24.384854 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" podStartSLOduration=1.668061799 podStartE2EDuration="8.384836357s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="2026-03-13 20:43:17.139767836 +0000 UTC m=+928.160883727" lastFinishedPulling="2026-03-13 20:43:23.856542394 +0000 UTC m=+934.877658285" observedRunningTime="2026-03-13 20:43:24.38165425 +0000 UTC m=+935.402770141" watchObservedRunningTime="2026-03-13 20:43:24.384836357 +0000 UTC m=+935.405952258" Mar 13 20:43:25 crc kubenswrapper[4790]: I0313 20:43:25.354553 4790 generic.go:334] "Generic (PLEG): container finished" podID="3ab7e856-a311-4e29-aabf-adaa27363613" containerID="0b052a2baddbeece7ca41bef76737dde35ce3c507cc3b2219e2854674ed991bd" exitCode=0 Mar 13 20:43:25 crc kubenswrapper[4790]: I0313 20:43:25.354614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerDied","Data":"0b052a2baddbeece7ca41bef76737dde35ce3c507cc3b2219e2854674ed991bd"} Mar 13 20:43:26 crc kubenswrapper[4790]: I0313 20:43:26.361900 4790 generic.go:334] "Generic (PLEG): container finished" podID="3ab7e856-a311-4e29-aabf-adaa27363613" containerID="7bf46f97328a6ee37e75f896df31fc2301c7e73214e16ae3f85cff47a0ad2a75" exitCode=0 Mar 13 20:43:26 crc kubenswrapper[4790]: I0313 20:43:26.361983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerDied","Data":"7bf46f97328a6ee37e75f896df31fc2301c7e73214e16ae3f85cff47a0ad2a75"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.376968 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"f1d31aaa0cb6d63b64969826ff66ad1625802ad0d839543120b8c6c1420816bf"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"c462c7755d333c97ba3a8fa96f732cd286e209277633fd3770b203861bb6f567"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"cdf225191adab2804534ddc3e506e7659f204d3375754b0fbbff8b6a55587198"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"3473d20eda94cf5cc80a4740762659ea126ca9510d1e872eeaea1be5650500d1"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377443 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"4921b57f72534d48dd03ace0e29e46079aa124fe14ae3c410c35da9961a8dcaa"} Mar 13 
20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.377500 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-r97zs" event={"ID":"3ab7e856-a311-4e29-aabf-adaa27363613","Type":"ContainerStarted","Data":"0aba41e40c99fb6a7c39a3128a7c6b9ed7247a9e12c598623f8bfd63af710add"} Mar 13 20:43:27 crc kubenswrapper[4790]: I0313 20:43:27.398077 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-r97zs" podStartSLOduration=4.318485801 podStartE2EDuration="11.398049991s" podCreationTimestamp="2026-03-13 20:43:16 +0000 UTC" firstStartedPulling="2026-03-13 20:43:16.746068376 +0000 UTC m=+927.767184267" lastFinishedPulling="2026-03-13 20:43:23.825632526 +0000 UTC m=+934.846748457" observedRunningTime="2026-03-13 20:43:27.395776779 +0000 UTC m=+938.416892680" watchObservedRunningTime="2026-03-13 20:43:27.398049991 +0000 UTC m=+938.419165872" Mar 13 20:43:28 crc kubenswrapper[4790]: I0313 20:43:28.448879 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5tk2m" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.070716 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.072122 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.075160 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-f8kc8" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.076171 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.079548 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.081885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.212995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"openstack-operator-index-bqc7r\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.313973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"openstack-operator-index-bqc7r\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.332303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"openstack-operator-index-bqc7r\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.399859 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.564862 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.603448 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:31 crc kubenswrapper[4790]: I0313 20:43:31.774098 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:31 crc kubenswrapper[4790]: W0313 20:43:31.781787 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92557fd1_85f4_48e5_9923_1d833bffe6d5.slice/crio-d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4 WatchSource:0}: Error finding container d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4: Status 404 returned error can't find the container with id d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4 Mar 13 20:43:32 crc kubenswrapper[4790]: I0313 20:43:32.404620 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerStarted","Data":"d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4"} Mar 13 20:43:34 crc kubenswrapper[4790]: I0313 20:43:34.452721 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.051790 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-58vcj"] Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.052833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.071324 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-58vcj"] Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.169644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l64wv\" (UniqueName: \"kubernetes.io/projected/db35ffd8-ac53-48ad-8035-53066c9df48b-kube-api-access-l64wv\") pod \"openstack-operator-index-58vcj\" (UID: \"db35ffd8-ac53-48ad-8035-53066c9df48b\") " pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.270727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l64wv\" (UniqueName: \"kubernetes.io/projected/db35ffd8-ac53-48ad-8035-53066c9df48b-kube-api-access-l64wv\") pod \"openstack-operator-index-58vcj\" (UID: \"db35ffd8-ac53-48ad-8035-53066c9df48b\") " pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.289063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l64wv\" (UniqueName: \"kubernetes.io/projected/db35ffd8-ac53-48ad-8035-53066c9df48b-kube-api-access-l64wv\") pod \"openstack-operator-index-58vcj\" (UID: \"db35ffd8-ac53-48ad-8035-53066c9df48b\") " pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.377177 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.436249 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerStarted","Data":"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4"} Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.436663 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-bqc7r" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" containerID="cri-o://42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" gracePeriod=2 Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.462868 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-bqc7r" podStartSLOduration=1.482063315 podStartE2EDuration="4.46284764s" podCreationTimestamp="2026-03-13 20:43:31 +0000 UTC" firstStartedPulling="2026-03-13 20:43:31.783629668 +0000 UTC m=+942.804745559" lastFinishedPulling="2026-03-13 20:43:34.764413993 +0000 UTC m=+945.785529884" observedRunningTime="2026-03-13 20:43:35.455015457 +0000 UTC m=+946.476131348" watchObservedRunningTime="2026-03-13 20:43:35.46284764 +0000 UTC m=+946.483963531" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.604861 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-58vcj"] Mar 13 20:43:35 crc kubenswrapper[4790]: W0313 20:43:35.609572 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb35ffd8_ac53_48ad_8035_53066c9df48b.slice/crio-e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5 WatchSource:0}: 
Error finding container e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5: Status 404 returned error can't find the container with id e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5 Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.770437 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.881881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") pod \"92557fd1-85f4-48e5-9923-1d833bffe6d5\" (UID: \"92557fd1-85f4-48e5-9923-1d833bffe6d5\") " Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.887432 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s" (OuterVolumeSpecName: "kube-api-access-mxq7s") pod "92557fd1-85f4-48e5-9923-1d833bffe6d5" (UID: "92557fd1-85f4-48e5-9923-1d833bffe6d5"). InnerVolumeSpecName "kube-api-access-mxq7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:43:35 crc kubenswrapper[4790]: I0313 20:43:35.983196 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxq7s\" (UniqueName: \"kubernetes.io/projected/92557fd1-85f4-48e5-9923-1d833bffe6d5-kube-api-access-mxq7s\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.446988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-58vcj" event={"ID":"db35ffd8-ac53-48ad-8035-53066c9df48b","Type":"ContainerStarted","Data":"dc77b656e09ee6c636ffbd3d7afbcaf2116a52871b4bd2d76dc5e9e500c3af2a"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.447061 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-58vcj" event={"ID":"db35ffd8-ac53-48ad-8035-53066c9df48b","Type":"ContainerStarted","Data":"e44106bc97a902c9a27a38a3f62d031bbd361bb2582f3c0b884655f0f82026b5"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.448951 4790 generic.go:334] "Generic (PLEG): container finished" podID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" exitCode=0 Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.448988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerDied","Data":"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.449032 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-bqc7r" event={"ID":"92557fd1-85f4-48e5-9923-1d833bffe6d5","Type":"ContainerDied","Data":"d1b154b866f226aa1234710b3ebe57a7012f2042be36547d641cdb380f9a21d4"} Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.449057 4790 scope.go:117] "RemoveContainer" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.449034 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-bqc7r" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.464341 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-58vcj" podStartSLOduration=1.402478769 podStartE2EDuration="1.464320198s" podCreationTimestamp="2026-03-13 20:43:35 +0000 UTC" firstStartedPulling="2026-03-13 20:43:35.612552161 +0000 UTC m=+946.633668052" lastFinishedPulling="2026-03-13 20:43:35.67439359 +0000 UTC m=+946.695509481" observedRunningTime="2026-03-13 20:43:36.462682765 +0000 UTC m=+947.483798686" watchObservedRunningTime="2026-03-13 20:43:36.464320198 +0000 UTC m=+947.485436089" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.473180 4790 scope.go:117] "RemoveContainer" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" Mar 13 20:43:36 crc kubenswrapper[4790]: E0313 20:43:36.473675 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4\": container with ID starting with 42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4 not found: ID does not exist" containerID="42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.473718 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4"} err="failed to get container status \"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4\": rpc error: code = NotFound desc = could not find container \"42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4\": container with ID starting with 42ba522c96212dbc810448309c3fddb36eade4becf47e5d6c3df49d7646f39a4 not found: ID does not exist" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.484794 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.488995 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-bqc7r"] Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.568067 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-r97zs" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.579749 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-8ckr8" Mar 13 20:43:36 crc kubenswrapper[4790]: I0313 20:43:36.665056 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-czl9k" Mar 13 20:43:37 crc kubenswrapper[4790]: I0313 20:43:37.670562 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" path="/var/lib/kubelet/pods/92557fd1-85f4-48e5-9923-1d833bffe6d5/volumes" Mar 13 20:43:44 crc kubenswrapper[4790]: I0313 20:43:44.016352 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:43:44 crc kubenswrapper[4790]: I0313 20:43:44.016978 4790 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.378227 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.379776 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.408242 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.462503 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:45 crc kubenswrapper[4790]: E0313 20:43:45.462772 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.462791 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.462900 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="92557fd1-85f4-48e5-9923-1d833bffe6d5" containerName="registry-server" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.463709 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.473901 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.526932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-58vcj" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.616444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.616498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.616587 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.717793 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.717843 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.717896 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.718224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.718265 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.736825 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"redhat-marketplace-kgpkk\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:45 crc kubenswrapper[4790]: I0313 20:43:45.791577 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.068813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.511926 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" exitCode=0 Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.512038 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d"} Mar 13 20:43:46 crc kubenswrapper[4790]: I0313 20:43:46.512405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerStarted","Data":"fbe0b7416d29f07efca01c0abb7eb4bb90760cb7b1eb3da1d6d801e7fefa45f8"} Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.086519 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk"] Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.088138 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.091643 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-5h4dr" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.093132 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk"] Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.154900 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.154974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.155093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.256516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.256845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.256969 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.257340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.257530 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.278887 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.402809 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.528246 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" exitCode=0 Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.528323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098"} Mar 13 20:43:48 crc kubenswrapper[4790]: I0313 20:43:48.795206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk"] Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.535646 4790 generic.go:334] "Generic (PLEG): container finished" podID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerID="9fd7e747c5f75aba3b14cf664cf0d79ce63a62bed0cc8cb9fe547f9eb8e037d7" exitCode=0 Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.535753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"9fd7e747c5f75aba3b14cf664cf0d79ce63a62bed0cc8cb9fe547f9eb8e037d7"} Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.536026 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerStarted","Data":"caed4dcfd5d370ca1eff47dafd8365437fd64756e69d6356a4612546b1258ad4"} Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.538658 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerStarted","Data":"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e"} Mar 13 20:43:49 crc kubenswrapper[4790]: I0313 20:43:49.575863 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kgpkk" podStartSLOduration=2.051199118 podStartE2EDuration="4.575842889s" podCreationTimestamp="2026-03-13 20:43:45 +0000 UTC" firstStartedPulling="2026-03-13 20:43:46.51309058 +0000 UTC m=+957.534206471" lastFinishedPulling="2026-03-13 20:43:49.037734351 +0000 UTC m=+960.058850242" observedRunningTime="2026-03-13 20:43:49.574304207 +0000 UTC m=+960.595420108" watchObservedRunningTime="2026-03-13 20:43:49.575842889 +0000 UTC m=+960.596958780" Mar 13 20:43:50 crc kubenswrapper[4790]: I0313 20:43:50.545933 4790 generic.go:334] "Generic (PLEG): container finished" podID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerID="f4164fba88cd5e16f917481b98911580189954a015fd7c0ae7792fd0306fe622" exitCode=0 Mar 13 20:43:50 crc kubenswrapper[4790]: I0313 20:43:50.545999 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"f4164fba88cd5e16f917481b98911580189954a015fd7c0ae7792fd0306fe622"} Mar 13 20:43:51 crc kubenswrapper[4790]: I0313 20:43:51.558144 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerID="7c073805963588cd60cce2eb9cea583f73b364e5e903872206f07b6527d29cfd" exitCode=0 Mar 13 20:43:51 crc kubenswrapper[4790]: I0313 20:43:51.558190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"7c073805963588cd60cce2eb9cea583f73b364e5e903872206f07b6527d29cfd"} Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.795001 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.814821 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") pod \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.814966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") pod \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.814991 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") pod \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\" (UID: \"4f787e63-2dda-4c6f-9c43-0b61658fed8c\") " Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.815469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle" (OuterVolumeSpecName: "bundle") pod "4f787e63-2dda-4c6f-9c43-0b61658fed8c" (UID: "4f787e63-2dda-4c6f-9c43-0b61658fed8c"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.818703 4790 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.822012 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx" (OuterVolumeSpecName: "kube-api-access-ct4zx") pod "4f787e63-2dda-4c6f-9c43-0b61658fed8c" (UID: "4f787e63-2dda-4c6f-9c43-0b61658fed8c"). InnerVolumeSpecName "kube-api-access-ct4zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.830743 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util" (OuterVolumeSpecName: "util") pod "4f787e63-2dda-4c6f-9c43-0b61658fed8c" (UID: "4f787e63-2dda-4c6f-9c43-0b61658fed8c"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.919979 4790 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4f787e63-2dda-4c6f-9c43-0b61658fed8c-util\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:52 crc kubenswrapper[4790]: I0313 20:43:52.920014 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ct4zx\" (UniqueName: \"kubernetes.io/projected/4f787e63-2dda-4c6f-9c43-0b61658fed8c-kube-api-access-ct4zx\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:53 crc kubenswrapper[4790]: I0313 20:43:53.577713 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" event={"ID":"4f787e63-2dda-4c6f-9c43-0b61658fed8c","Type":"ContainerDied","Data":"caed4dcfd5d370ca1eff47dafd8365437fd64756e69d6356a4612546b1258ad4"} Mar 13 20:43:53 crc kubenswrapper[4790]: I0313 20:43:53.577776 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="caed4dcfd5d370ca1eff47dafd8365437fd64756e69d6356a4612546b1258ad4" Mar 13 20:43:53 crc kubenswrapper[4790]: I0313 20:43:53.577782 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk" Mar 13 20:43:55 crc kubenswrapper[4790]: I0313 20:43:55.792364 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:55 crc kubenswrapper[4790]: I0313 20:43:55.792750 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:55 crc kubenswrapper[4790]: I0313 20:43:55.834197 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:56 crc kubenswrapper[4790]: I0313 20:43:56.634765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.985907 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t"] Mar 13 20:43:57 crc kubenswrapper[4790]: E0313 20:43:57.986751 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="util" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.986771 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="util" Mar 13 20:43:57 crc kubenswrapper[4790]: E0313 20:43:57.986808 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="pull" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.986817 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="pull" Mar 13 20:43:57 crc kubenswrapper[4790]: E0313 20:43:57.986837 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="extract" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.986847 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="extract" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.987094 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4f787e63-2dda-4c6f-9c43-0b61658fed8c" containerName="extract" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.988060 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:57 crc kubenswrapper[4790]: I0313 20:43:57.992847 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-w9x6j" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.003558 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t"] Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.083775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nppw5\" (UniqueName: \"kubernetes.io/projected/87b8083b-23ab-4733-a7ac-85bf1e565551-kube-api-access-nppw5\") pod \"openstack-operator-controller-init-5c46d6fb64-bj72t\" (UID: \"87b8083b-23ab-4733-a7ac-85bf1e565551\") " pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.184144 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nppw5\" (UniqueName: \"kubernetes.io/projected/87b8083b-23ab-4733-a7ac-85bf1e565551-kube-api-access-nppw5\") pod \"openstack-operator-controller-init-5c46d6fb64-bj72t\" (UID: \"87b8083b-23ab-4733-a7ac-85bf1e565551\") " pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.203795 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nppw5\" (UniqueName: \"kubernetes.io/projected/87b8083b-23ab-4733-a7ac-85bf1e565551-kube-api-access-nppw5\") pod \"openstack-operator-controller-init-5c46d6fb64-bj72t\" (UID: \"87b8083b-23ab-4733-a7ac-85bf1e565551\") " pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.250474 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.306653 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.607031 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kgpkk" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" containerID="cri-o://340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" gracePeriod=2 Mar 13 20:43:58 crc kubenswrapper[4790]: I0313 20:43:58.721083 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t"] Mar 13 20:43:58 crc kubenswrapper[4790]: W0313 20:43:58.732282 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87b8083b_23ab_4733_a7ac_85bf1e565551.slice/crio-7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a WatchSource:0}: Error finding container 7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a: Status 404 returned error can't find the container with id 7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.517901 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.613597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" event={"ID":"87b8083b-23ab-4733-a7ac-85bf1e565551","Type":"ContainerStarted","Data":"7968752759ea70234f074f0065fd39045a76f94cadf62da4a11131fc95ef3c1a"} Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.615912 4790 generic.go:334] "Generic (PLEG): container finished" podID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" exitCode=0 Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.615953 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kgpkk" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.615983 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e"} Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.616037 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kgpkk" event={"ID":"9d2e8f16-dbb0-48ce-ab69-fba11373e67a","Type":"ContainerDied","Data":"fbe0b7416d29f07efca01c0abb7eb4bb90760cb7b1eb3da1d6d801e7fefa45f8"} Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.616070 4790 scope.go:117] "RemoveContainer" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.637422 4790 scope.go:117] "RemoveContainer" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.657741 4790 scope.go:117] "RemoveContainer" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.697632 4790 scope.go:117] "RemoveContainer" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" Mar 13 20:43:59 crc kubenswrapper[4790]: E0313 20:43:59.700093 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e\": container with ID starting with 340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e not found: ID does not exist" containerID="340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e"} err="failed to get container status \"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e\": rpc error: code = NotFound desc = could not find container \"340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e\": container with ID starting with 340da78a121a624414260105291f5a9e2190e265d915a7e7c8de2faa6ace7d8e not found: ID does not exist" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700172 4790 scope.go:117] "RemoveContainer" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" Mar 13 20:43:59 crc kubenswrapper[4790]: E0313 20:43:59.700464 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098\": container with ID starting with 799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098 not found: ID does not exist" containerID="799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700494 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098"} err="failed to get container status \"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098\": rpc error: code = NotFound desc = could not find container 
\"799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098\": container with ID starting with 799d9bb76bc5cb58ad39283b6b767a827b6f2eb34baf58bf7f4c86be001fc098 not found: ID does not exist" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700513 4790 scope.go:117] "RemoveContainer" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" Mar 13 20:43:59 crc kubenswrapper[4790]: E0313 20:43:59.700772 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d\": container with ID starting with 091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d not found: ID does not exist" containerID="091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.700800 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d"} err="failed to get container status \"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d\": rpc error: code = NotFound desc = could not find container \"091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d\": container with ID starting with 091e18889d4542386c444df0f094e476b4a8e73629907be774b41de3d879523d not found: ID does not exist" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.703301 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") pod \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.703488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") pod \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.703543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") pod \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\" (UID: \"9d2e8f16-dbb0-48ce-ab69-fba11373e67a\") " Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.704390 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities" (OuterVolumeSpecName: "utilities") pod "9d2e8f16-dbb0-48ce-ab69-fba11373e67a" (UID: "9d2e8f16-dbb0-48ce-ab69-fba11373e67a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.711067 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh" (OuterVolumeSpecName: "kube-api-access-8kwrh") pod "9d2e8f16-dbb0-48ce-ab69-fba11373e67a" (UID: "9d2e8f16-dbb0-48ce-ab69-fba11373e67a"). InnerVolumeSpecName "kube-api-access-8kwrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.730918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9d2e8f16-dbb0-48ce-ab69-fba11373e67a" (UID: "9d2e8f16-dbb0-48ce-ab69-fba11373e67a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.805118 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.805504 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kwrh\" (UniqueName: \"kubernetes.io/projected/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-kube-api-access-8kwrh\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.805525 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9d2e8f16-dbb0-48ce-ab69-fba11373e67a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.954138 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:43:59 crc kubenswrapper[4790]: I0313 20:43:59.965966 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kgpkk"] Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.154769 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:44:00 crc kubenswrapper[4790]: E0313 20:44:00.155265 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155287 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" Mar 13 20:44:00 crc kubenswrapper[4790]: E0313 20:44:00.155301 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-utilities" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155310 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-utilities" Mar 13 20:44:00 crc kubenswrapper[4790]: E0313 20:44:00.155323 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-content" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155335 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="extract-content" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.155550 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" containerName="registry-server" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.156241 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.158310 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.160046 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.160298 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.161677 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.310520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"auto-csr-approver-29557244-sndr9\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.411788 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"auto-csr-approver-29557244-sndr9\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.429849 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"auto-csr-approver-29557244-sndr9\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:00 crc kubenswrapper[4790]: I0313 20:44:00.475887 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:01 crc kubenswrapper[4790]: I0313 20:44:01.671395 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d2e8f16-dbb0-48ce-ab69-fba11373e67a" path="/var/lib/kubelet/pods/9d2e8f16-dbb0-48ce-ab69-fba11373e67a/volumes" Mar 13 20:44:02 crc kubenswrapper[4790]: I0313 20:44:02.940016 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.666980 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.667278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" event={"ID":"87b8083b-23ab-4733-a7ac-85bf1e565551","Type":"ContainerStarted","Data":"855ab8ac8ecde547b97686731e931e4ea878b7ff76196b393fca2fe9f0074695"} Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.667297 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557244-sndr9" event={"ID":"7f42b93e-6de8-423c-a2d5-dd57885de32c","Type":"ContainerStarted","Data":"d8f4e3de2382a875a84d1c776e64fd6b6600c72b4dbf64d6c90df645bb558dd6"} Mar 13 20:44:03 crc kubenswrapper[4790]: I0313 20:44:03.690151 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" podStartSLOduration=2.781983377 podStartE2EDuration="6.690132941s" podCreationTimestamp="2026-03-13 20:43:57 +0000 UTC" firstStartedPulling="2026-03-13 20:43:58.735078056 +0000 UTC m=+969.756193947" lastFinishedPulling="2026-03-13 20:44:02.64322762 +0000 UTC m=+973.664343511" observedRunningTime="2026-03-13 20:44:03.689184376 +0000 UTC m=+974.710300267" watchObservedRunningTime="2026-03-13 20:44:03.690132941 +0000 UTC m=+974.711248832" Mar 13 20:44:04 crc kubenswrapper[4790]: I0313 20:44:04.669903 4790 generic.go:334] "Generic (PLEG): container finished" podID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerID="721d15acd59eb0b2b9f8d48eaa51f02f0b2b5cc626d1243f5a398968f008ce5a" exitCode=0 Mar 13 20:44:04 crc kubenswrapper[4790]: I0313 20:44:04.670367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557244-sndr9" event={"ID":"7f42b93e-6de8-423c-a2d5-dd57885de32c","Type":"ContainerDied","Data":"721d15acd59eb0b2b9f8d48eaa51f02f0b2b5cc626d1243f5a398968f008ce5a"} Mar 13 20:44:05 crc kubenswrapper[4790]: I0313 20:44:05.905101 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.096154 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") pod \"7f42b93e-6de8-423c-a2d5-dd57885de32c\" (UID: \"7f42b93e-6de8-423c-a2d5-dd57885de32c\") " Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.101202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn" (OuterVolumeSpecName: "kube-api-access-kwhhn") pod "7f42b93e-6de8-423c-a2d5-dd57885de32c" (UID: "7f42b93e-6de8-423c-a2d5-dd57885de32c"). 
InnerVolumeSpecName "kube-api-access-kwhhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.198885 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhhn\" (UniqueName: \"kubernetes.io/projected/7f42b93e-6de8-423c-a2d5-dd57885de32c-kube-api-access-kwhhn\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.682416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557244-sndr9" event={"ID":"7f42b93e-6de8-423c-a2d5-dd57885de32c","Type":"ContainerDied","Data":"d8f4e3de2382a875a84d1c776e64fd6b6600c72b4dbf64d6c90df645bb558dd6"} Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.682472 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8f4e3de2382a875a84d1c776e64fd6b6600c72b4dbf64d6c90df645bb558dd6" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.682516 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557244-sndr9" Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.954018 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:44:06 crc kubenswrapper[4790]: I0313 20:44:06.966801 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557238-jx8wj"] Mar 13 20:44:07 crc kubenswrapper[4790]: I0313 20:44:07.669948 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c63bf97-e702-439a-8f3b-58d4496c91b9" path="/var/lib/kubelet/pods/6c63bf97-e702-439a-8f3b-58d4496c91b9/volumes" Mar 13 20:44:08 crc kubenswrapper[4790]: I0313 20:44:08.310078 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5c46d6fb64-bj72t" Mar 13 20:44:14 crc kubenswrapper[4790]: I0313 20:44:14.018494 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:44:14 crc kubenswrapper[4790]: I0313 20:44:14.018800 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.285416 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:27 crc kubenswrapper[4790]: E0313 20:44:27.286256 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerName="oc" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.286272 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerName="oc" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.286433 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" containerName="oc" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.287395 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.301031 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.403508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.403825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.403987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.505661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.505715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.505758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.506223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.506442 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.526305 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"certified-operators-jgrz9\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.614004 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:27 crc kubenswrapper[4790]: I0313 20:44:27.881186 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.823016 4790 generic.go:334] "Generic (PLEG): container finished" podID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerID="13883d616d7859b8b1f4e3643b2470ceb4a60d0faba96109c31a1ecc31533caa" exitCode=0 Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.823447 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"13883d616d7859b8b1f4e3643b2470ceb4a60d0faba96109c31a1ecc31533caa"} Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.823477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerStarted","Data":"94c84ec1662023adbd79b891587ec02bac606782a1b69fbe98e2395146aadf04"} Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.888837 4790 scope.go:117] "RemoveContainer" containerID="a1eeddc06106c1113c4a31e23128dada69c832330fa1711ed5544055f1b4392f" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.954392 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8p67"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.955463 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.957711 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-c789s" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.961272 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.962085 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.965730 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-xm7hh" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.969647 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8p67"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.980027 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.992028 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9"] Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.992935 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:28 crc kubenswrapper[4790]: I0313 20:44:28.995727 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6gbht" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.008457 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.054580 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.056208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.056313 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.061590 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zth67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.089497 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.090464 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.093648 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-8wnbp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.099804 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.111278 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.112467 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.119996 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-9wgrk" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.120336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.140006 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.140882 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.145724 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.149532 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-km8xp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.152560 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.153362 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.154884 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-nc2qj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164083 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sznf\" (UniqueName: \"kubernetes.io/projected/bdbe5269-1150-4269-bc28-1d719f1b77b6-kube-api-access-7sznf\") pod \"barbican-operator-controller-manager-d47688694-s8p67\" (UID: \"bdbe5269-1150-4269-bc28-1d719f1b77b6\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzgl8\" (UniqueName: \"kubernetes.io/projected/dd8df218-c492-4e48-93a9-f5f2dbf7fc00-kube-api-access-rzgl8\") pod \"cinder-operator-controller-manager-984cd4dcf-5plwh\" (UID: \"dd8df218-c492-4e48-93a9-f5f2dbf7fc00\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxdfb\" (UniqueName: \"kubernetes.io/projected/46fb44a5-f567-4f58-80b1-dd70694f9339-kube-api-access-xxdfb\") pod \"designate-operator-controller-manager-66d56f6ff4-h7rc9\" (UID: \"46fb44a5-f567-4f58-80b1-dd70694f9339\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.164994 4790 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.165977 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.176915 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-2lhj2" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.184874 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.194661 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.198000 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.201792 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-n6tgh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.224770 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.233510 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.261016 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.261805 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.265276 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-8lttf" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.265987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5j9wn\" (UniqueName: \"kubernetes.io/projected/460b6997-f558-4e5f-9e15-aa33fece4f4b-kube-api-access-5j9wn\") pod \"horizon-operator-controller-manager-6d9d6b584d-nzdzx\" (UID: \"460b6997-f558-4e5f-9e15-aa33fece4f4b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266026 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbqlp\" (UniqueName: \"kubernetes.io/projected/77f24ce6-bc52-4831-902c-255983a8f911-kube-api-access-sbqlp\") pod \"keystone-operator-controller-manager-684f77d66d-5vcsg\" (UID: \"77f24ce6-bc52-4831-902c-255983a8f911\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzgl8\" (UniqueName: \"kubernetes.io/projected/dd8df218-c492-4e48-93a9-f5f2dbf7fc00-kube-api-access-rzgl8\") pod \"cinder-operator-controller-manager-984cd4dcf-5plwh\" (UID: \"dd8df218-c492-4e48-93a9-f5f2dbf7fc00\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxdfb\" (UniqueName: \"kubernetes.io/projected/46fb44a5-f567-4f58-80b1-dd70694f9339-kube-api-access-xxdfb\") pod \"designate-operator-controller-manager-66d56f6ff4-h7rc9\" (UID: \"46fb44a5-f567-4f58-80b1-dd70694f9339\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266172 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8qt2\" (UniqueName: \"kubernetes.io/projected/7caf7136-8a46-410b-8a32-72ab19e8baca-kube-api-access-h8qt2\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sznf\" (UniqueName: \"kubernetes.io/projected/bdbe5269-1150-4269-bc28-1d719f1b77b6-kube-api-access-7sznf\") pod \"barbican-operator-controller-manager-d47688694-s8p67\" (UID: \"bdbe5269-1150-4269-bc28-1d719f1b77b6\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266277 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krblr\" (UniqueName: \"kubernetes.io/projected/e154cc44-2769-4bfe-b8ef-3f6c56f08f74-kube-api-access-krblr\") pod \"glance-operator-controller-manager-5964f64c48-tzx96\" (UID: \"e154cc44-2769-4bfe-b8ef-3f6c56f08f74\") " 
pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlzvf\" (UniqueName: \"kubernetes.io/projected/a7488d00-50bc-4ce8-ae0a-8d3ff807c0da-kube-api-access-zlzvf\") pod \"heat-operator-controller-manager-77b6666d85-q5nj7\" (UID: \"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.266598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr89w\" (UniqueName: \"kubernetes.io/projected/2747d064-d45f-4a4e-87c2-d2c9f82eac10-kube-api-access-zr89w\") pod \"ironic-operator-controller-manager-5bc894d9b-wfltj\" (UID: \"2747d064-d45f-4a4e-87c2-d2c9f82eac10\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.275202 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.280738 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.283225 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.292265 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.293787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-8685q" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.311253 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.312050 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.315430 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-s96ts" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.323721 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxdfb\" (UniqueName: \"kubernetes.io/projected/46fb44a5-f567-4f58-80b1-dd70694f9339-kube-api-access-xxdfb\") pod \"designate-operator-controller-manager-66d56f6ff4-h7rc9\" (UID: \"46fb44a5-f567-4f58-80b1-dd70694f9339\") " pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.324071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sznf\" (UniqueName: \"kubernetes.io/projected/bdbe5269-1150-4269-bc28-1d719f1b77b6-kube-api-access-7sznf\") pod \"barbican-operator-controller-manager-d47688694-s8p67\" (UID: \"bdbe5269-1150-4269-bc28-1d719f1b77b6\") " pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.338348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzgl8\" (UniqueName: \"kubernetes.io/projected/dd8df218-c492-4e48-93a9-f5f2dbf7fc00-kube-api-access-rzgl8\") pod \"cinder-operator-controller-manager-984cd4dcf-5plwh\" (UID: \"dd8df218-c492-4e48-93a9-f5f2dbf7fc00\") " pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.338737 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.351887 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.366556 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5j9wn\" (UniqueName: \"kubernetes.io/projected/460b6997-f558-4e5f-9e15-aa33fece4f4b-kube-api-access-5j9wn\") pod \"horizon-operator-controller-manager-6d9d6b584d-nzdzx\" (UID: \"460b6997-f558-4e5f-9e15-aa33fece4f4b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnq54\" (UniqueName: \"kubernetes.io/projected/5befe4e4-4574-42ac-90ce-ac67c1e33eee-kube-api-access-wnq54\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v\" (UID: \"5befe4e4-4574-42ac-90ce-ac67c1e33eee\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367698 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbqlp\" (UniqueName: \"kubernetes.io/projected/77f24ce6-bc52-4831-902c-255983a8f911-kube-api-access-sbqlp\") pod \"keystone-operator-controller-manager-684f77d66d-5vcsg\" (UID: 
\"77f24ce6-bc52-4831-902c-255983a8f911\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367774 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsbbn\" (UniqueName: \"kubernetes.io/projected/b5a018c4-3e3a-4f77-a272-20c94a5b9c7a-kube-api-access-fsbbn\") pod \"manila-operator-controller-manager-57b484b4df-hlk9s\" (UID: \"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367887 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8qt2\" (UniqueName: \"kubernetes.io/projected/7caf7136-8a46-410b-8a32-72ab19e8baca-kube-api-access-h8qt2\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367978 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.367981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krblr\" (UniqueName: \"kubernetes.io/projected/e154cc44-2769-4bfe-b8ef-3f6c56f08f74-kube-api-access-krblr\") pod \"glance-operator-controller-manager-5964f64c48-tzx96\" (UID: \"e154cc44-2769-4bfe-b8ef-3f6c56f08f74\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.368919 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.369016 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlzvf\" (UniqueName: \"kubernetes.io/projected/a7488d00-50bc-4ce8-ae0a-8d3ff807c0da-kube-api-access-zlzvf\") pod \"heat-operator-controller-manager-77b6666d85-q5nj7\" (UID: \"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.369091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr89w\" (UniqueName: \"kubernetes.io/projected/2747d064-d45f-4a4e-87c2-d2c9f82eac10-kube-api-access-zr89w\") pod \"ironic-operator-controller-manager-5bc894d9b-wfltj\" (UID: \"2747d064-d45f-4a4e-87c2-d2c9f82eac10\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.369022 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.369233 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. 
No retries permitted until 2026-03-13 20:44:29.869209515 +0000 UTC m=+1000.890325406 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.373717 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6h66p" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.392792 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.404258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbqlp\" (UniqueName: \"kubernetes.io/projected/77f24ce6-bc52-4831-902c-255983a8f911-kube-api-access-sbqlp\") pod \"keystone-operator-controller-manager-684f77d66d-5vcsg\" (UID: \"77f24ce6-bc52-4831-902c-255983a8f911\") " pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.406267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krblr\" (UniqueName: \"kubernetes.io/projected/e154cc44-2769-4bfe-b8ef-3f6c56f08f74-kube-api-access-krblr\") pod \"glance-operator-controller-manager-5964f64c48-tzx96\" (UID: \"e154cc44-2769-4bfe-b8ef-3f6c56f08f74\") " pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.406810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr89w\" (UniqueName: \"kubernetes.io/projected/2747d064-d45f-4a4e-87c2-d2c9f82eac10-kube-api-access-zr89w\") pod \"ironic-operator-controller-manager-5bc894d9b-wfltj\" (UID: \"2747d064-d45f-4a4e-87c2-d2c9f82eac10\") " pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.406926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5j9wn\" (UniqueName: \"kubernetes.io/projected/460b6997-f558-4e5f-9e15-aa33fece4f4b-kube-api-access-5j9wn\") pod \"horizon-operator-controller-manager-6d9d6b584d-nzdzx\" (UID: \"460b6997-f558-4e5f-9e15-aa33fece4f4b\") " pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.409431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8qt2\" (UniqueName: \"kubernetes.io/projected/7caf7136-8a46-410b-8a32-72ab19e8baca-kube-api-access-h8qt2\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.412018 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlzvf\" (UniqueName: \"kubernetes.io/projected/a7488d00-50bc-4ce8-ae0a-8d3ff807c0da-kube-api-access-zlzvf\") pod \"heat-operator-controller-manager-77b6666d85-q5nj7\" (UID: \"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da\") " pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: 
I0313 20:44:29.421756 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.432711 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.433804 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.439609 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.440125 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-rxwr4" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.446279 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.447172 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.452795 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-9fht8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.464951 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470099 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2hzf\" (UniqueName: \"kubernetes.io/projected/403c2990-8871-47da-abd8-8c9fc5753d54-kube-api-access-g2hzf\") pod \"octavia-operator-controller-manager-5f4f55cb5c-tbbfl\" (UID: \"403c2990-8871-47da-abd8-8c9fc5753d54\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470217 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jbvd\" (UniqueName: \"kubernetes.io/projected/499aa973-6f5e-4229-9282-52c4fbf0625f-kube-api-access-7jbvd\") pod \"neutron-operator-controller-manager-776c5696bf-dxntp\" (UID: \"499aa973-6f5e-4229-9282-52c4fbf0625f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnq54\" (UniqueName: \"kubernetes.io/projected/5befe4e4-4574-42ac-90ce-ac67c1e33eee-kube-api-access-wnq54\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v\" (UID: \"5befe4e4-4574-42ac-90ce-ac67c1e33eee\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470326 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsbbn\" (UniqueName: 
\"kubernetes.io/projected/b5a018c4-3e3a-4f77-a272-20c94a5b9c7a-kube-api-access-fsbbn\") pod \"manila-operator-controller-manager-57b484b4df-hlk9s\" (UID: \"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.470355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpxs5\" (UniqueName: \"kubernetes.io/projected/386f7e46-c2e3-4eae-aa82-05075883c889-kube-api-access-cpxs5\") pod \"nova-operator-controller-manager-7f84474648-b8lpj\" (UID: \"386f7e46-c2e3-4eae-aa82-05075883c889\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.481657 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.482642 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.492673 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-5zqv4" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.497591 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsbbn\" (UniqueName: \"kubernetes.io/projected/b5a018c4-3e3a-4f77-a272-20c94a5b9c7a-kube-api-access-fsbbn\") pod \"manila-operator-controller-manager-57b484b4df-hlk9s\" (UID: \"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a\") " pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.503235 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnq54\" (UniqueName: \"kubernetes.io/projected/5befe4e4-4574-42ac-90ce-ac67c1e33eee-kube-api-access-wnq54\") pod \"mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v\" (UID: \"5befe4e4-4574-42ac-90ce-ac67c1e33eee\") " pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.503288 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.525004 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.525658 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.529601 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.540041 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-nt4tx" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.543326 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.552454 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.590768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp9hr\" (UniqueName: \"kubernetes.io/projected/b1273818-139a-4213-b23c-609a7305c92f-kube-api-access-sp9hr\") pod \"ovn-operator-controller-manager-bbc5b68f9-hwdv8\" (UID: \"b1273818-139a-4213-b23c-609a7305c92f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.590844 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.591350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sslbn\" (UniqueName: \"kubernetes.io/projected/b36f993b-25cd-4f12-bf48-77bf6f4cf26b-kube-api-access-sslbn\") pod \"placement-operator-controller-manager-574d45c66c-c9lbv\" (UID: \"b36f993b-25cd-4f12-bf48-77bf6f4cf26b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.591768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28d98\" (UniqueName: \"kubernetes.io/projected/5622f52e-2e94-41ca-a9d2-a0c833895937-kube-api-access-28d98\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.592207 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2hzf\" (UniqueName: \"kubernetes.io/projected/403c2990-8871-47da-abd8-8c9fc5753d54-kube-api-access-g2hzf\") pod \"octavia-operator-controller-manager-5f4f55cb5c-tbbfl\" (UID: \"403c2990-8871-47da-abd8-8c9fc5753d54\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.592744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7jbvd\" (UniqueName: \"kubernetes.io/projected/499aa973-6f5e-4229-9282-52c4fbf0625f-kube-api-access-7jbvd\") pod \"neutron-operator-controller-manager-776c5696bf-dxntp\" (UID: \"499aa973-6f5e-4229-9282-52c4fbf0625f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.592874 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpxs5\" (UniqueName: \"kubernetes.io/projected/386f7e46-c2e3-4eae-aa82-05075883c889-kube-api-access-cpxs5\") pod \"nova-operator-controller-manager-7f84474648-b8lpj\" (UID: \"386f7e46-c2e3-4eae-aa82-05075883c889\") " 
pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.593233 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.597770 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.598292 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.598453 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.609841 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.615158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2hzf\" (UniqueName: \"kubernetes.io/projected/403c2990-8871-47da-abd8-8c9fc5753d54-kube-api-access-g2hzf\") pod \"octavia-operator-controller-manager-5f4f55cb5c-tbbfl\" (UID: \"403c2990-8871-47da-abd8-8c9fc5753d54\") " pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.618866 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.623105 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jbvd\" (UniqueName: \"kubernetes.io/projected/499aa973-6f5e-4229-9282-52c4fbf0625f-kube-api-access-7jbvd\") pod \"neutron-operator-controller-manager-776c5696bf-dxntp\" (UID: \"499aa973-6f5e-4229-9282-52c4fbf0625f\") " pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.629324 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.632426 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpxs5\" (UniqueName: \"kubernetes.io/projected/386f7e46-c2e3-4eae-aa82-05075883c889-kube-api-access-cpxs5\") pod \"nova-operator-controller-manager-7f84474648-b8lpj\" (UID: \"386f7e46-c2e3-4eae-aa82-05075883c889\") " pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.663606 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.676925 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.677733 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.680923 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-ls2zb" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.700196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sslbn\" (UniqueName: \"kubernetes.io/projected/b36f993b-25cd-4f12-bf48-77bf6f4cf26b-kube-api-access-sslbn\") pod \"placement-operator-controller-manager-574d45c66c-c9lbv\" (UID: \"b36f993b-25cd-4f12-bf48-77bf6f4cf26b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.700420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28d98\" (UniqueName: \"kubernetes.io/projected/5622f52e-2e94-41ca-a9d2-a0c833895937-kube-api-access-28d98\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.704778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.706034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.706212 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.707311 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.708277 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkr4g\" (UniqueName: \"kubernetes.io/projected/0244e4ae-2ccd-482a-b490-58a8e46ab53d-kube-api-access-zkr4g\") pod \"swift-operator-controller-manager-7f9cc5dd44-ppzzz\" (UID: \"0244e4ae-2ccd-482a-b490-58a8e46ab53d\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.709584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sp9hr\" (UniqueName: \"kubernetes.io/projected/b1273818-139a-4213-b23c-609a7305c92f-kube-api-access-sp9hr\") pod \"ovn-operator-controller-manager-bbc5b68f9-hwdv8\" (UID: \"b1273818-139a-4213-b23c-609a7305c92f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.709648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.709774 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.709835 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.209822025 +0000 UTC m=+1001.230937916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.711427 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.714197 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-kbjrq" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.728748 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sslbn\" (UniqueName: \"kubernetes.io/projected/b36f993b-25cd-4f12-bf48-77bf6f4cf26b-kube-api-access-sslbn\") pod \"placement-operator-controller-manager-574d45c66c-c9lbv\" (UID: \"b36f993b-25cd-4f12-bf48-77bf6f4cf26b\") " pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.746190 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp9hr\" (UniqueName: \"kubernetes.io/projected/b1273818-139a-4213-b23c-609a7305c92f-kube-api-access-sp9hr\") pod \"ovn-operator-controller-manager-bbc5b68f9-hwdv8\" (UID: \"b1273818-139a-4213-b23c-609a7305c92f\") " pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.747676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28d98\" (UniqueName: \"kubernetes.io/projected/5622f52e-2e94-41ca-a9d2-a0c833895937-kube-api-access-28d98\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.750398 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.797006 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.816455 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.817421 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.818834 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkr4g\" (UniqueName: \"kubernetes.io/projected/0244e4ae-2ccd-482a-b490-58a8e46ab53d-kube-api-access-zkr4g\") pod \"swift-operator-controller-manager-7f9cc5dd44-ppzzz\" (UID: \"0244e4ae-2ccd-482a-b490-58a8e46ab53d\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.818888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ql5n\" (UniqueName: \"kubernetes.io/projected/2032df10-91a5-4a88-9705-c355f50a5024-kube-api-access-5ql5n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-f8l4s\" (UID: \"2032df10-91a5-4a88-9705-c355f50a5024\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.829642 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-dsggs" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.830573 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.858491 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.859559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.864479 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.864680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.864888 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hzb9g" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.866788 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.868244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkr4g\" (UniqueName: \"kubernetes.io/projected/0244e4ae-2ccd-482a-b490-58a8e46ab53d-kube-api-access-zkr4g\") pod \"swift-operator-controller-manager-7f9cc5dd44-ppzzz\" (UID: \"0244e4ae-2ccd-482a-b490-58a8e46ab53d\") " pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.873983 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.880096 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.917140 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920560 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920655 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbwnz\" (UniqueName: \"kubernetes.io/projected/47bdfeda-c97a-40b5-82f8-1008ba20e75b-kube-api-access-rbwnz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-5689f\" (UID: \"47bdfeda-c97a-40b5-82f8-1008ba20e75b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9zgd\" (UniqueName: \"kubernetes.io/projected/a36ba835-deb4-41f5-9b6a-57d1e577c8b1-kube-api-access-q9zgd\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfb9g\" (UID: \"a36ba835-deb4-41f5-9b6a-57d1e577c8b1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920718 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920749 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ql5n\" (UniqueName: \"kubernetes.io/projected/2032df10-91a5-4a88-9705-c355f50a5024-kube-api-access-5ql5n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-f8l4s\" (UID: \"2032df10-91a5-4a88-9705-c355f50a5024\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.920823 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdqpb\" (UniqueName: 
\"kubernetes.io/projected/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-kube-api-access-sdqpb\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.920976 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: E0313 20:44:29.921035 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.921017835 +0000 UTC m=+1001.942133726 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.946023 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.946873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.949247 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ql5n\" (UniqueName: \"kubernetes.io/projected/2032df10-91a5-4a88-9705-c355f50a5024-kube-api-access-5ql5n\") pod \"telemetry-operator-controller-manager-6854b8b9d9-f8l4s\" (UID: \"2032df10-91a5-4a88-9705-c355f50a5024\") " pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.952680 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9"] Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.956124 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-sn67s" Mar 13 20:44:29 crc kubenswrapper[4790]: W0313 20:44:29.961490 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7488d00_50bc_4ce8_ae0a_8d3ff807c0da.slice/crio-88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b WatchSource:0}: Error finding container 88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b: Status 404 returned error can't find the container with id 88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b Mar 13 20:44:29 crc kubenswrapper[4790]: I0313 20:44:29.985095 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.023899 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdqpb\" (UniqueName: \"kubernetes.io/projected/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-kube-api-access-sdqpb\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024714 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbwnz\" (UniqueName: \"kubernetes.io/projected/47bdfeda-c97a-40b5-82f8-1008ba20e75b-kube-api-access-rbwnz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-5689f\" (UID: \"47bdfeda-c97a-40b5-82f8-1008ba20e75b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.024747 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9zgd\" (UniqueName: \"kubernetes.io/projected/a36ba835-deb4-41f5-9b6a-57d1e577c8b1-kube-api-access-q9zgd\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfb9g\" (UID: \"a36ba835-deb4-41f5-9b6a-57d1e577c8b1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.024821 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.024906 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.524882823 +0000 UTC m=+1001.545998784 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.025071 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.025101 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:30.525092089 +0000 UTC m=+1001.546208080 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.057296 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbwnz\" (UniqueName: \"kubernetes.io/projected/47bdfeda-c97a-40b5-82f8-1008ba20e75b-kube-api-access-rbwnz\") pod \"watcher-operator-controller-manager-6c4d75f7f9-5689f\" (UID: \"47bdfeda-c97a-40b5-82f8-1008ba20e75b\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.067177 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9zgd\" (UniqueName: \"kubernetes.io/projected/a36ba835-deb4-41f5-9b6a-57d1e577c8b1-kube-api-access-q9zgd\") pod \"test-operator-controller-manager-5c5cb9c4d7-cfb9g\" (UID: \"a36ba835-deb4-41f5-9b6a-57d1e577c8b1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.090038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdqpb\" (UniqueName: \"kubernetes.io/projected/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-kube-api-access-sdqpb\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.125964 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j87jp\" (UniqueName: \"kubernetes.io/projected/22e6d110-bd87-4d28-851d-307b4223ee8f-kube-api-access-j87jp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvrl9\" (UID: \"22e6d110-bd87-4d28-851d-307b4223ee8f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.167648 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.179221 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.190663 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.228203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.228319 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j87jp\" (UniqueName: \"kubernetes.io/projected/22e6d110-bd87-4d28-851d-307b4223ee8f-kube-api-access-j87jp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvrl9\" (UID: \"22e6d110-bd87-4d28-851d-307b4223ee8f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.228442 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.228486 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:31.228472655 +0000 UTC m=+1002.249588546 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.238779 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod460b6997_f558_4e5f_9e15_aa33fece4f4b.slice/crio-69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20 WatchSource:0}: Error finding container 69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20: Status 404 returned error can't find the container with id 69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.249955 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j87jp\" (UniqueName: \"kubernetes.io/projected/22e6d110-bd87-4d28-851d-307b4223ee8f-kube-api-access-j87jp\") pod \"rabbitmq-cluster-operator-manager-668c99d594-xvrl9\" (UID: \"22e6d110-bd87-4d28-851d-307b4223ee8f\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.341844 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.370144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.457050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.465980 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.471264 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-d47688694-s8p67"] Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.516549 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a018c4_3e3a_4f77_a272_20c94a5b9c7a.slice/crio-2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae WatchSource:0}: Error finding container 2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae: Status 404 returned error can't find the container with id 2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.538869 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbe5269_1150_4269_bc28_1d719f1b77b6.slice/crio-7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5 WatchSource:0}: Error finding container 7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5: Status 404 returned error can't find the container with id 7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.552832 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.552938 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553124 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553189 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:31.553174804 +0000 UTC m=+1002.574290695 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553668 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.553702 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:31.553693619 +0000 UTC m=+1002.574809510 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.597478 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.623141 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.843330 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v"] Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.857661 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96"] Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.874988 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5befe4e4_4574_42ac_90ce_ac67c1e33eee.slice/crio-fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971 WatchSource:0}: Error finding container fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971: Status 404 returned error can't find the container with id fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.876996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" event={"ID":"dd8df218-c492-4e48-93a9-f5f2dbf7fc00","Type":"ContainerStarted","Data":"c9228b2b9361a604f58db4a2eef1f2e90da985b067f1754db68fef1126f2ae05"} Mar 13 20:44:30 crc kubenswrapper[4790]: W0313 20:44:30.878727 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode154cc44_2769_4bfe_b8ef_3f6c56f08f74.slice/crio-750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673 WatchSource:0}: Error finding container 750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673: Status 404 returned error can't find the container with id 750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.879253 4790 generic.go:334] "Generic (PLEG): container finished" podID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" 
containerID="e192663f06dfb187428edb1e170aca9856113025c267275042a42fcf172697f7" exitCode=0 Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.879323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"e192663f06dfb187428edb1e170aca9856113025c267275042a42fcf172697f7"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.880322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" event={"ID":"77f24ce6-bc52-4831-902c-255983a8f911","Type":"ContainerStarted","Data":"dbd1091664789a5509e9f7034d0926af7220a21a6ea02635b0173a9b600c6c2d"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.882248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" event={"ID":"460b6997-f558-4e5f-9e15-aa33fece4f4b","Type":"ContainerStarted","Data":"69c2a352d559833439b415b9bf356ed965314e08515de94a727a7f835b101c20"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.888367 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" event={"ID":"bdbe5269-1150-4269-bc28-1d719f1b77b6","Type":"ContainerStarted","Data":"7af45bb6d7ee3106bf724a5b0032db8dccd6431f9e2fee47deb8f7445183d5a5"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.894845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" event={"ID":"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a","Type":"ContainerStarted","Data":"2c3a00a361a645198b14c86bc083a1da8e0389c6ab64a155018fc0af31c1a4ae"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.896536 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" event={"ID":"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da","Type":"ContainerStarted","Data":"88314e17948ded46962315cdf0d51b36233b450d17b62ec8aa726fd2277a480b"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.900312 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" event={"ID":"46fb44a5-f567-4f58-80b1-dd70694f9339","Type":"ContainerStarted","Data":"b5ba93f1f107573b0d70724ba7c4e4057f7c5ac0a7a5d323a07d9c2217790481"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.901696 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" event={"ID":"2747d064-d45f-4a4e-87c2-d2c9f82eac10","Type":"ContainerStarted","Data":"ea6092c7addb8875e9e417beb7252e64ffc37b00aab65b67c59881bb119fb4a5"} Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.961049 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.961225 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: E0313 20:44:30.972324 4790 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:32.972280685 +0000 UTC m=+1003.993396666 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:30 crc kubenswrapper[4790]: I0313 20:44:30.973651 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.007743 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.013911 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl"] Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.025605 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod499aa973_6f5e_4229_9282_52c4fbf0625f.slice/crio-73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8 WatchSource:0}: Error finding container 73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8: Status 404 returned error can't find the container with id 73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8 Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.028571 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod403c2990_8871_47da_abd8_8c9fc5753d54.slice/crio-fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7 WatchSource:0}: Error finding container fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7: Status 404 returned error can't find the container with id fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7 Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.105133 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.113328 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.122805 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.132543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz"] Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.137581 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sp9hr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-bbc5b68f9-hwdv8_openstack-operators(b1273818-139a-4213-b23c-609a7305c92f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.139292 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podUID="b1273818-139a-4213-b23c-609a7305c92f" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.144487 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s"] Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.149172 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2032df10_91a5_4a88_9705_c355f50a5024.slice/crio-5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662 WatchSource:0}: Error finding container 5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662: Status 404 returned error can't find the container with id 5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662 Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.151967 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g"] Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.152121 4790 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb36f993b_25cd_4f12_bf48_77bf6f4cf26b.slice/crio-f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16 WatchSource:0}: Error finding container f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16: Status 404 returned error can't find the container with id f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16 Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.154732 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5ql5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-6854b8b9d9-f8l4s_openstack-operators(2032df10-91a5-4a88-9705-c355f50a5024): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.154740 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sslbn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-574d45c66c-c9lbv_openstack-operators(b36f993b-25cd-4f12-bf48-77bf6f4cf26b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.155922 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podUID="b36f993b-25cd-4f12-bf48-77bf6f4cf26b" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.155926 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podUID="2032df10-91a5-4a88-9705-c355f50a5024" Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.172141 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47bdfeda_c97a_40b5_82f8_1008ba20e75b.slice/crio-fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061 WatchSource:0}: Error finding container fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061: Status 404 returned error can't find the container with id fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061 Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.175856 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rbwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-5689f_openstack-operators(47bdfeda-c97a-40b5-82f8-1008ba20e75b): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: W0313 20:44:31.176745 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0244e4ae_2ccd_482a_b490_58a8e46ab53d.slice/crio-1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8 WatchSource:0}: Error finding container 1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8: Status 404 returned error can't find the container with id 1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8 Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.178527 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podUID="47bdfeda-c97a-40b5-82f8-1008ba20e75b" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.183194 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zkr4g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-7f9cc5dd44-ppzzz_openstack-operators(0244e4ae-2ccd-482a-b490-58a8e46ab53d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.184444 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podUID="0244e4ae-2ccd-482a-b490-58a8e46ab53d" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.252448 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9"] Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.266535 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.267035 4790 secret.go:188] Couldn't get secret 
openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.267157 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:33.267138833 +0000 UTC m=+1004.288254724 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.569771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.569827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.569967 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.569995 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.570058 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:33.570038921 +0000 UTC m=+1004.591154812 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.570076 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:33.570067851 +0000 UTC m=+1004.591183742 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.937864 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerStarted","Data":"09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24"} Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.940987 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" event={"ID":"0244e4ae-2ccd-482a-b490-58a8e46ab53d","Type":"ContainerStarted","Data":"1494c806ff22ad950dfdb3f564467ae8345bdbaefb552e163d60a947b73c30d8"} Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.943133 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podUID="0244e4ae-2ccd-482a-b490-58a8e46ab53d" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.950362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" event={"ID":"b1273818-139a-4213-b23c-609a7305c92f","Type":"ContainerStarted","Data":"e7020105e4517bfbe1bb13af02b8b25abb6552ac1ae5b914868dd911d9396e64"} Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.952445 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podUID="b1273818-139a-4213-b23c-609a7305c92f" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.958177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" event={"ID":"499aa973-6f5e-4229-9282-52c4fbf0625f","Type":"ContainerStarted","Data":"73fc5045fc738d9e6c72f4868b4432b0c6069a1457ba333c45b480798f7238e8"} Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.961083 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jgrz9" podStartSLOduration=2.17809555 podStartE2EDuration="4.961061969s" podCreationTimestamp="2026-03-13 20:44:27 +0000 UTC" firstStartedPulling="2026-03-13 20:44:28.82460865 +0000 UTC m=+999.845724541" lastFinishedPulling="2026-03-13 20:44:31.607575069 +0000 UTC m=+1002.628690960" observedRunningTime="2026-03-13 20:44:31.95631032 +0000 UTC m=+1002.977426231" watchObservedRunningTime="2026-03-13 20:44:31.961061969 +0000 UTC m=+1002.982177860" Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.978411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" 
event={"ID":"403c2990-8871-47da-abd8-8c9fc5753d54","Type":"ContainerStarted","Data":"fa0608dd8880d0583e2815e607f5a86d97edf6584168ec85743b47e6235a63d7"} Mar 13 20:44:31 crc kubenswrapper[4790]: I0313 20:44:31.979890 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" event={"ID":"b36f993b-25cd-4f12-bf48-77bf6f4cf26b","Type":"ContainerStarted","Data":"f2801e97683362807e0323a55a8dc52073a52f4ba738c701cf75ed3c5abf5c16"} Mar 13 20:44:31 crc kubenswrapper[4790]: E0313 20:44:31.992255 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podUID="b36f993b-25cd-4f12-bf48-77bf6f4cf26b" Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:31.996580 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" event={"ID":"a36ba835-deb4-41f5-9b6a-57d1e577c8b1","Type":"ContainerStarted","Data":"7679f460ba89c0e1f9a653df373d65029dac58b10d4b33dbf370a9bd2ca1a341"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.012502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" event={"ID":"22e6d110-bd87-4d28-851d-307b4223ee8f","Type":"ContainerStarted","Data":"22032535d0f3b23460ceb619582eb046ee9d8684867e59095b92daf1e309bab0"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.014279 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" event={"ID":"2032df10-91a5-4a88-9705-c355f50a5024","Type":"ContainerStarted","Data":"5e79e6db9e6e6c63b246b4f17677fd322e65da34837527d519d091d2e36b1662"} Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.016856 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podUID="2032df10-91a5-4a88-9705-c355f50a5024" Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.023476 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" event={"ID":"386f7e46-c2e3-4eae-aa82-05075883c889","Type":"ContainerStarted","Data":"8826287611f4404be5662201187022cde01073b80013348065ac5caa09fc4463"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.026414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" event={"ID":"47bdfeda-c97a-40b5-82f8-1008ba20e75b","Type":"ContainerStarted","Data":"fe3bc9d5e65108f137424ae3d888a49148b2d3b8e9b886d23169b747c049b061"} Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.028416 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podUID="47bdfeda-c97a-40b5-82f8-1008ba20e75b" Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.044855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" event={"ID":"5befe4e4-4574-42ac-90ce-ac67c1e33eee","Type":"ContainerStarted","Data":"fdeb90e491c03789dbf99c55d2d74776a14b8b2833d300849a02386f3ba72971"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.054074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" event={"ID":"e154cc44-2769-4bfe-b8ef-3f6c56f08f74","Type":"ContainerStarted","Data":"750e36aae99028ac45b5d8402f6a8b525f1d8da757fcd5c61374d3536cda5673"} Mar 13 20:44:32 crc kubenswrapper[4790]: I0313 20:44:32.987558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.987767 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:32 crc kubenswrapper[4790]: E0313 20:44:32.987813 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:36.987798833 +0000 UTC m=+1008.008914724 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.065202 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:2f63ddf5c95c6c82f6e04bc9f7f20d56dc003614647726ab00276239eec40b7f\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podUID="b1273818-139a-4213-b23c-609a7305c92f" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.065206 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:72db77c98e7bca64d469b4dc316e9c8d329681f825d19ef8f333437fb1c6d3f5\\\"\"" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podUID="0244e4ae-2ccd-482a-b490-58a8e46ab53d" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.066561 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:e7e865363955c670e41b6c042c4f87abceff78f5495ba5c5c82988baad45c978\\\"\"" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podUID="b36f993b-25cd-4f12-bf48-77bf6f4cf26b" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.066708 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:3a0fc90da4caf7412ae01e21542b53a10fe7a2732a705b0ae83f926d72c7332a\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podUID="2032df10-91a5-4a88-9705-c355f50a5024" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.066800 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podUID="47bdfeda-c97a-40b5-82f8-1008ba20e75b" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.291531 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.291657 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:37.291618426 +0000 UTC m=+1008.312734317 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: I0313 20:44:33.291741 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:33 crc kubenswrapper[4790]: I0313 20:44:33.596297 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.596486 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.596569 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:37.596550478 +0000 UTC m=+1008.617666369 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: I0313 20:44:33.596904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.597026 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:33 crc kubenswrapper[4790]: E0313 20:44:33.597062 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:37.597052902 +0000 UTC m=+1008.618168793 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.051997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.052168 4790 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.052438 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert podName:7caf7136-8a46-410b-8a32-72ab19e8baca nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.052419101 +0000 UTC m=+1016.073534992 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert") pod "infra-operator-controller-manager-54dc5b8f8d-jrr7h" (UID: "7caf7136-8a46-410b-8a32-72ab19e8baca") : secret "infra-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.356679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.356858 4790 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.356935 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert podName:5622f52e-2e94-41ca-a9d2-a0c833895937 nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.356916752 +0000 UTC m=+1016.378032643 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert") pod "openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" (UID: "5622f52e-2e94-41ca-a9d2-a0c833895937") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.614719 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.615048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.657751 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.660190 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:37 crc kubenswrapper[4790]: I0313 20:44:37.660250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660323 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660417 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.660397485 +0000 UTC m=+1016.681513386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660323 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:37 crc kubenswrapper[4790]: E0313 20:44:37.660485 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:44:45.660466587 +0000 UTC m=+1016.681582478 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:38 crc kubenswrapper[4790]: I0313 20:44:38.143822 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:38 crc kubenswrapper[4790]: I0313 20:44:38.202713 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:40 crc kubenswrapper[4790]: I0313 20:44:40.111419 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jgrz9" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" containerID="cri-o://09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24" gracePeriod=2 Mar 13 20:44:41 crc kubenswrapper[4790]: I0313 20:44:41.120710 4790 generic.go:334] "Generic (PLEG): container finished" podID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerID="09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24" exitCode=0 Mar 13 20:44:41 crc kubenswrapper[4790]: I0313 20:44:41.120805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24"} Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.015249 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.015593 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.015641 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.016232 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:44:44 crc kubenswrapper[4790]: I0313 20:44:44.016284 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf" gracePeriod=600 Mar 13 20:44:44 crc kubenswrapper[4790]: E0313 20:44:44.897512 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying 
config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60" Mar 13 20:44:44 crc kubenswrapper[4790]: E0313 20:44:44.897694 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krblr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-5964f64c48-tzx96_openstack-operators(e154cc44-2769-4bfe-b8ef-3f6c56f08f74): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:44 crc kubenswrapper[4790]: E0313 20:44:44.898862 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" podUID="e154cc44-2769-4bfe-b8ef-3f6c56f08f74" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.076632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" 
Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.100827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7caf7136-8a46-410b-8a32-72ab19e8baca-cert\") pod \"infra-operator-controller-manager-54dc5b8f8d-jrr7h\" (UID: \"7caf7136-8a46-410b-8a32-72ab19e8baca\") " pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.106511 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.150543 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf" exitCode=0 Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.150805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf"} Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.150879 4790 scope.go:117] "RemoveContainer" containerID="79e02ea9be9e1c9905df96f4d2c3972a24c6d7bee0d427327ce884018a382f4c" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.152135 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:a3bc074ddd9a26d3a8609e5dbdfa85a78449ba1c9b5542bff9949219d6760e60\\\"\"" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" podUID="e154cc44-2769-4bfe-b8ef-3f6c56f08f74" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.381719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.385500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5622f52e-2e94-41ca-a9d2-a0c833895937-cert\") pod \"openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn\" (UID: \"5622f52e-2e94-41ca-a9d2-a0c833895937\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.395106 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.530433 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.530602 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7jbvd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-776c5696bf-dxntp_openstack-operators(499aa973-6f5e-4229-9282-52c4fbf0625f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.538180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" podUID="499aa973-6f5e-4229-9282-52c4fbf0625f" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.559471 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.685552 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") pod \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.685759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") pod \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.685842 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") pod \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\" (UID: \"8883fcbc-75ff-43e3-8088-f2ba848e9d3a\") " Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.686105 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.686148 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686292 4790 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686352 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:45:01.6863344 +0000 UTC m=+1032.707450291 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "metrics-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686291 4790 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: E0313 20:44:45.686453 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs podName:bf0c2c50-711c-4fbd-8c15-64bf6fc3572b nodeName:}" failed. No retries permitted until 2026-03-13 20:45:01.686435003 +0000 UTC m=+1032.707550894 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs") pod "openstack-operator-controller-manager-5698bc49b8-xpzcd" (UID: "bf0c2c50-711c-4fbd-8c15-64bf6fc3572b") : secret "webhook-server-cert" not found Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.687011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities" (OuterVolumeSpecName: "utilities") pod "8883fcbc-75ff-43e3-8088-f2ba848e9d3a" (UID: "8883fcbc-75ff-43e3-8088-f2ba848e9d3a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.694034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n" (OuterVolumeSpecName: "kube-api-access-gf67n") pod "8883fcbc-75ff-43e3-8088-f2ba848e9d3a" (UID: "8883fcbc-75ff-43e3-8088-f2ba848e9d3a"). InnerVolumeSpecName "kube-api-access-gf67n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.736039 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8883fcbc-75ff-43e3-8088-f2ba848e9d3a" (UID: "8883fcbc-75ff-43e3-8088-f2ba848e9d3a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.787250 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.787286 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf67n\" (UniqueName: \"kubernetes.io/projected/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-kube-api-access-gf67n\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:45 crc kubenswrapper[4790]: I0313 20:44:45.787299 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8883fcbc-75ff-43e3-8088-f2ba848e9d3a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.137149 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.137341 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m 
DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xxdfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-66d56f6ff4-h7rc9_openstack-operators(46fb44a5-f567-4f58-80b1-dd70694f9339): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.138570 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" podUID="46fb44a5-f567-4f58-80b1-dd70694f9339" Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.157264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jgrz9" event={"ID":"8883fcbc-75ff-43e3-8088-f2ba848e9d3a","Type":"ContainerDied","Data":"94c84ec1662023adbd79b891587ec02bac606782a1b69fbe98e2395146aadf04"} Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.157287 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jgrz9" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.158780 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:65d0c97340f72a8b23f8e11f4b3efcc6ad37daad9b88e24d4564383a08fa85f7\\\"\"" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" podUID="46fb44a5-f567-4f58-80b1-dd70694f9339" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.159597 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:5fe5351a3de5e1267112d52cd81477a01d47f90be713cc5439c76543a4c33721\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" podUID="499aa973-6f5e-4229-9282-52c4fbf0625f" Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.204141 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:46 crc kubenswrapper[4790]: I0313 20:44:46.209886 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jgrz9"] Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.720903 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.721116 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7sznf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-d47688694-s8p67_openstack-operators(bdbe5269-1150-4269-bc28-1d719f1b77b6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:46 crc kubenswrapper[4790]: E0313 20:44:46.722521 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" podUID="bdbe5269-1150-4269-bc28-1d719f1b77b6" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.208217 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:47dae162826e2e457bdc34f6dfebcf8f7d56e189fdbeba2e0118991a420a4165\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" podUID="bdbe5269-1150-4269-bc28-1d719f1b77b6" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.362466 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.362646 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wnq54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v_openstack-operators(5befe4e4-4574-42ac-90ce-ac67c1e33eee): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.364008 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" podUID="5befe4e4-4574-42ac-90ce-ac67c1e33eee" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.673561 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" path="/var/lib/kubelet/pods/8883fcbc-75ff-43e3-8088-f2ba848e9d3a/volumes" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807051 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.807804 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-content" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807828 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-content" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.807900 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807912 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" Mar 13 20:44:47 crc kubenswrapper[4790]: E0313 20:44:47.807927 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-utilities" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.807936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="extract-utilities" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.808236 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8883fcbc-75ff-43e3-8088-f2ba848e9d3a" containerName="registry-server" Mar 13 20:44:47 
crc kubenswrapper[4790]: I0313 20:44:47.809455 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.816514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.932674 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.932757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:47 crc kubenswrapper[4790]: I0313 20:44:47.932855 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034260 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034344 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034581 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.034899 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" 
Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.051556 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"community-operators-xjs8f\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: I0313 20:44:48.131569 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.200125 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:a26d062af19b3bc6dc6633171f1eff8eec33e8e925465d4968a0b9a36012a7e7\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" podUID="5befe4e4-4574-42ac-90ce-ac67c1e33eee" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.526250 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.526425 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sbqlp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-684f77d66d-5vcsg_openstack-operators(77f24ce6-bc52-4831-902c-255983a8f911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:48 crc kubenswrapper[4790]: E0313 20:44:48.527727 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" podUID="77f24ce6-bc52-4831-902c-255983a8f911" Mar 13 20:44:49 crc kubenswrapper[4790]: E0313 20:44:49.206651 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:40b84319f2f12a1c7ee478fd86a8b1aa5ac2ea8e24f5ce0f1ca78ad879dea8ca\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" podUID="77f24ce6-bc52-4831-902c-255983a8f911" Mar 13 20:44:50 crc kubenswrapper[4790]: I0313 20:44:50.952277 4790 scope.go:117] "RemoveContainer" containerID="09c550befb39cbb65b9bb00cebc6593f5a02e3c2c1137a099257ae9d32351e24" Mar 13 20:44:53 crc kubenswrapper[4790]: E0313 20:44:53.774804 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff" Mar 13 20:44:53 crc kubenswrapper[4790]: E0313 20:44:53.775573 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cpxs5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-7f84474648-b8lpj_openstack-operators(386f7e46-c2e3-4eae-aa82-05075883c889): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:53 crc kubenswrapper[4790]: E0313 20:44:53.777256 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" podUID="386f7e46-c2e3-4eae-aa82-05075883c889" Mar 13 20:44:54 crc kubenswrapper[4790]: E0313 20:44:54.250827 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:bbe772fa051f782c9dcc3c34ce43495e1116aa9089a760c10068790baa9b25ff\\\"\"" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" podUID="386f7e46-c2e3-4eae-aa82-05075883c889" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.101298 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.101492 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-j87jp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-xvrl9_openstack-operators(22e6d110-bd87-4d28-851d-307b4223ee8f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.103589 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" podUID="22e6d110-bd87-4d28-851d-307b4223ee8f" Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.162275 4790 scope.go:117] "RemoveContainer" containerID="e192663f06dfb187428edb1e170aca9856113025c267275042a42fcf172697f7" Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.234916 4790 scope.go:117] "RemoveContainer" containerID="13883d616d7859b8b1f4e3643b2470ceb4a60d0faba96109c31a1ecc31533caa" Mar 13 20:44:55 crc kubenswrapper[4790]: E0313 20:44:55.266025 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" podUID="22e6d110-bd87-4d28-851d-307b4223ee8f" Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.461147 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn"] Mar 13 20:44:55 crc 
kubenswrapper[4790]: W0313 20:44:55.466642 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5622f52e_2e94_41ca_a9d2_a0c833895937.slice/crio-cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25 WatchSource:0}: Error finding container cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25: Status 404 returned error can't find the container with id cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25 Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.573740 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h"] Mar 13 20:44:55 crc kubenswrapper[4790]: I0313 20:44:55.704697 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.272320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.274605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" event={"ID":"5622f52e-2e94-41ca-a9d2-a0c833895937","Type":"ContainerStarted","Data":"cd529476d6037583eee6c696b1d2ac3ea85e978bad8da5c7f5a9e1bea8fefb25"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.281487 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" event={"ID":"b5a018c4-3e3a-4f77-a272-20c94a5b9c7a","Type":"ContainerStarted","Data":"10bdaf4d26e7e5fe23d771ead6638f0b751f8f939d42553bd683fa795643fc52"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.281613 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.299623 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" event={"ID":"b36f993b-25cd-4f12-bf48-77bf6f4cf26b","Type":"ContainerStarted","Data":"8ab77fc960301017ff2965634df34952f9217a4c67bd87ec0a32f41213570074"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.300094 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.337057 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerID="fb40f1dd553eeada38a0eebd58e1e4a584d8b04e5e145194a1581e6f7877058c" exitCode=0 Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.337128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"fb40f1dd553eeada38a0eebd58e1e4a584d8b04e5e145194a1581e6f7877058c"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.337153 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" 
event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerStarted","Data":"2c5f243728839b68f10d728f37eb16ef3d5e7896648c916636341303be93e6ac"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.344774 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" event={"ID":"a36ba835-deb4-41f5-9b6a-57d1e577c8b1","Type":"ContainerStarted","Data":"163b0e72972fc5ab793eec150a56f0e470a1cfcd2f192f73d3752fdace53e20a"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.346013 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.366867 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" event={"ID":"403c2990-8871-47da-abd8-8c9fc5753d54","Type":"ContainerStarted","Data":"2059239a14ccd1b0820091782feb06972de1bdde603a4ed32224a99486a598ac"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.367620 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.383272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" event={"ID":"a7488d00-50bc-4ce8-ae0a-8d3ff807c0da","Type":"ContainerStarted","Data":"38862ca9cc80e3d7be5dd86265607af3e9603c6af6bfa840e8077bc0e61a6f76"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.383904 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.403507 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" event={"ID":"0244e4ae-2ccd-482a-b490-58a8e46ab53d","Type":"ContainerStarted","Data":"1527737f5112aff2b5a9800a20cb1658d3bcc389c6c26a39fc763408541f7ab1"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.404077 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.409108 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" event={"ID":"dd8df218-c492-4e48-93a9-f5f2dbf7fc00","Type":"ContainerStarted","Data":"46b6eb30288ceb9db61be39bd00d47823550dd3ec2e1610ed5dbb25d29e279ed"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.409834 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.411123 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" event={"ID":"47bdfeda-c97a-40b5-82f8-1008ba20e75b","Type":"ContainerStarted","Data":"b9100d0a09419790d3788fe172d9b1e725ba5b7571b08ce9361b2a7a344df5fd"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.411619 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.412939 
4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" event={"ID":"2747d064-d45f-4a4e-87c2-d2c9f82eac10","Type":"ContainerStarted","Data":"4b4149598dab0644f4b7235becb2e775cb2d1423853250057db34f1dbc9605e3"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.413466 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.414650 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" event={"ID":"2032df10-91a5-4a88-9705-c355f50a5024","Type":"ContainerStarted","Data":"54231c927763897d95b75b93d1914fe37cf1bc7bb82c53c3e52083ee1929070e"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.415108 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.432356 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" podStartSLOduration=3.369006536 podStartE2EDuration="27.432337337s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.154504178 +0000 UTC m=+1002.175620069" lastFinishedPulling="2026-03-13 20:44:55.217834979 +0000 UTC m=+1026.238950870" observedRunningTime="2026-03-13 20:44:56.400792621 +0000 UTC m=+1027.421908522" watchObservedRunningTime="2026-03-13 20:44:56.432337337 +0000 UTC m=+1027.453453228" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.449834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" event={"ID":"460b6997-f558-4e5f-9e15-aa33fece4f4b","Type":"ContainerStarted","Data":"e0f4fd3d12b0110991e073cf73d1b83026cd9e0942ddaa63adbc72fd664a0ed0"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.450585 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.466030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" event={"ID":"7caf7136-8a46-410b-8a32-72ab19e8baca","Type":"ContainerStarted","Data":"7fff8705da681634b83077f3798d4c96c7f4199628e55122f1056dbc03d811fc"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.489669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" event={"ID":"b1273818-139a-4213-b23c-609a7305c92f","Type":"ContainerStarted","Data":"5b405895cbc2593f65b40d1a9e1a3c335cd77a812d3f01402ff4c60e9e14af0c"} Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.490453 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.528561 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" podStartSLOduration=7.109120341 podStartE2EDuration="27.528540767s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 
20:44:30.521870155 +0000 UTC m=+1001.542986046" lastFinishedPulling="2026-03-13 20:44:50.941290581 +0000 UTC m=+1021.962406472" observedRunningTime="2026-03-13 20:44:56.479465185 +0000 UTC m=+1027.500581076" watchObservedRunningTime="2026-03-13 20:44:56.528540767 +0000 UTC m=+1027.549656648" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.537960 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" podStartSLOduration=6.838166571 podStartE2EDuration="27.537942272s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.241562861 +0000 UTC m=+1001.262678752" lastFinishedPulling="2026-03-13 20:44:50.941338512 +0000 UTC m=+1021.962454453" observedRunningTime="2026-03-13 20:44:56.523955522 +0000 UTC m=+1027.545071413" watchObservedRunningTime="2026-03-13 20:44:56.537942272 +0000 UTC m=+1027.559058163" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.604869 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" podStartSLOduration=7.71249584 podStartE2EDuration="27.604849267s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.038957124 +0000 UTC m=+1002.060073015" lastFinishedPulling="2026-03-13 20:44:50.931310501 +0000 UTC m=+1021.952426442" observedRunningTime="2026-03-13 20:44:56.548592781 +0000 UTC m=+1027.569708672" watchObservedRunningTime="2026-03-13 20:44:56.604849267 +0000 UTC m=+1027.625965148" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.610492 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" podStartSLOduration=3.54642724 podStartE2EDuration="27.61047782s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.15457488 +0000 UTC m=+1002.175690781" lastFinishedPulling="2026-03-13 20:44:55.21862547 +0000 UTC m=+1026.239741361" observedRunningTime="2026-03-13 20:44:56.595885194 +0000 UTC m=+1027.617001085" watchObservedRunningTime="2026-03-13 20:44:56.61047782 +0000 UTC m=+1027.631593701" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.624110 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" podStartSLOduration=6.720485607 podStartE2EDuration="27.624096209s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.027627137 +0000 UTC m=+1001.048743028" lastFinishedPulling="2026-03-13 20:44:50.931237739 +0000 UTC m=+1021.952353630" observedRunningTime="2026-03-13 20:44:56.623712239 +0000 UTC m=+1027.644828120" watchObservedRunningTime="2026-03-13 20:44:56.624096209 +0000 UTC m=+1027.645212100" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.656388 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" podStartSLOduration=8.212864364 podStartE2EDuration="27.656363094s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.257512474 +0000 UTC m=+1001.278628365" lastFinishedPulling="2026-03-13 20:44:49.701011204 +0000 UTC m=+1020.722127095" observedRunningTime="2026-03-13 20:44:56.652750637 +0000 UTC m=+1027.673866528" watchObservedRunningTime="2026-03-13 
20:44:56.656363094 +0000 UTC m=+1027.677478975" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.690069 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" podStartSLOduration=8.408657096 podStartE2EDuration="28.690052478s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.651147662 +0000 UTC m=+1001.672263553" lastFinishedPulling="2026-03-13 20:44:50.932543044 +0000 UTC m=+1021.953658935" observedRunningTime="2026-03-13 20:44:56.684155488 +0000 UTC m=+1027.705271379" watchObservedRunningTime="2026-03-13 20:44:56.690052478 +0000 UTC m=+1027.711168369" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.726172 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" podStartSLOduration=7.891147477 podStartE2EDuration="27.726154308s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.133410326 +0000 UTC m=+1002.154526217" lastFinishedPulling="2026-03-13 20:44:50.968417157 +0000 UTC m=+1021.989533048" observedRunningTime="2026-03-13 20:44:56.717622676 +0000 UTC m=+1027.738738567" watchObservedRunningTime="2026-03-13 20:44:56.726154308 +0000 UTC m=+1027.747270199" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.785664 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" podStartSLOduration=3.750913857 podStartE2EDuration="27.785646202s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.183036642 +0000 UTC m=+1002.204152543" lastFinishedPulling="2026-03-13 20:44:55.217768997 +0000 UTC m=+1026.238884888" observedRunningTime="2026-03-13 20:44:56.754329002 +0000 UTC m=+1027.775444893" watchObservedRunningTime="2026-03-13 20:44:56.785646202 +0000 UTC m=+1027.806762093" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.805295 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" podStartSLOduration=3.769389989 podStartE2EDuration="27.805268775s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.175682633 +0000 UTC m=+1002.196798524" lastFinishedPulling="2026-03-13 20:44:55.211561419 +0000 UTC m=+1026.232677310" observedRunningTime="2026-03-13 20:44:56.804838852 +0000 UTC m=+1027.825954753" watchObservedRunningTime="2026-03-13 20:44:56.805268775 +0000 UTC m=+1027.826384666" Mar 13 20:44:56 crc kubenswrapper[4790]: I0313 20:44:56.834077 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" podStartSLOduration=3.753683743 podStartE2EDuration="27.834060966s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.137431185 +0000 UTC m=+1002.158547076" lastFinishedPulling="2026-03-13 20:44:55.217808408 +0000 UTC m=+1026.238924299" observedRunningTime="2026-03-13 20:44:56.8320068 +0000 UTC m=+1027.853122691" watchObservedRunningTime="2026-03-13 20:44:56.834060966 +0000 UTC m=+1027.855176857" Mar 13 20:44:57 crc kubenswrapper[4790]: I0313 20:44:57.502528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" 
event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerStarted","Data":"5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f"} Mar 13 20:44:58 crc kubenswrapper[4790]: I0313 20:44:58.514324 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerID="5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f" exitCode=0 Mar 13 20:44:58 crc kubenswrapper[4790]: I0313 20:44:58.514475 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f"} Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.027662 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-6854b8b9d9-f8l4s" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.152875 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.153763 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.158188 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.158974 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.165128 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.174637 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-5689f" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.239402 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.239455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.239498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 
20:45:00.340824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.340878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.340912 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.341733 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.352257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.357053 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-cfb9g" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.360862 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"collect-profiles-29557245-5vhkw\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:00 crc kubenswrapper[4790]: I0313 20:45:00.481011 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.764889 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.766003 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.776597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-webhook-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.776949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/bf0c2c50-711c-4fbd-8c15-64bf6fc3572b-metrics-certs\") pod \"openstack-operator-controller-manager-5698bc49b8-xpzcd\" (UID: \"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b\") " pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:01 crc kubenswrapper[4790]: I0313 20:45:01.993713 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-hzb9g" Mar 13 20:45:02 crc kubenswrapper[4790]: I0313 20:45:02.002463 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:02 crc kubenswrapper[4790]: I0313 20:45:02.651567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd"] Mar 13 20:45:02 crc kubenswrapper[4790]: W0313 20:45:02.662212 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbf0c2c50_711c_4fbd_8c15_64bf6fc3572b.slice/crio-f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b WatchSource:0}: Error finding container f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b: Status 404 returned error can't find the container with id f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b Mar 13 20:45:02 crc kubenswrapper[4790]: I0313 20:45:02.711192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.565963 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" event={"ID":"5622f52e-2e94-41ca-a9d2-a0c833895937","Type":"ContainerStarted","Data":"b532c8d4ff76521cedacf0b6e724309f963fdac4a36da5ec87fe020e3821c9c6"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.566464 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.567722 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" event={"ID":"5befe4e4-4574-42ac-90ce-ac67c1e33eee","Type":"ContainerStarted","Data":"d150027c0431ef4b5a35c47118d66fb1a18dbe46a37355f6a03f139f63ee375a"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.568425 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.570043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" event={"ID":"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b","Type":"ContainerStarted","Data":"32f741f9d4ed079d11ada5c34c9ea9b182a33477d0b60355d0c392292d2cdc9f"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.570088 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" event={"ID":"bf0c2c50-711c-4fbd-8c15-64bf6fc3572b","Type":"ContainerStarted","Data":"f4d360b2a9b9ae95cabe0c4be4d4d79a8eec64b200f19f4e158d18d5b681c25b"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.570522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.573772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerStarted","Data":"364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.575639 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" event={"ID":"bdbe5269-1150-4269-bc28-1d719f1b77b6","Type":"ContainerStarted","Data":"f33338f374d5f656ccba1ad365e80d3c837dbfe86e4c0d15bb451944cd12ab6f"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.576115 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.577416 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" event={"ID":"46fb44a5-f567-4f58-80b1-dd70694f9339","Type":"ContainerStarted","Data":"6da7077e28e4ea6eb90d9b1e96e4e21fed0b4f0ce53b7f1a279b46cf64d7d1d4"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.577749 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.579030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" event={"ID":"7caf7136-8a46-410b-8a32-72ab19e8baca","Type":"ContainerStarted","Data":"583c54558feaa4a25a7ffdd55dd6d4247e8e29c1a64aeb290f75da651694992d"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.579469 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.580808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerStarted","Data":"0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.580832 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerStarted","Data":"a5074ae610244688560a689dbf0c1ccfe6efc4d574e04194d1a7f6904e949e82"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.582480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" event={"ID":"e154cc44-2769-4bfe-b8ef-3f6c56f08f74","Type":"ContainerStarted","Data":"8e0185e92d472f4a6d7bda7359a2cb786124f86991ab9827c4dff79bed91a701"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.582869 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.584050 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" event={"ID":"499aa973-6f5e-4229-9282-52c4fbf0625f","Type":"ContainerStarted","Data":"e9b8a3f9f47d71e1b7fcb34ee4fb4e790a0198bf9f7e737d3a875923a1ecab26"} Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.584454 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.633780 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-xjs8f" podStartSLOduration=10.5920972 podStartE2EDuration="16.633756743s" podCreationTimestamp="2026-03-13 20:44:47 +0000 UTC" firstStartedPulling="2026-03-13 20:44:56.340908037 +0000 UTC m=+1027.362023928" lastFinishedPulling="2026-03-13 20:45:02.38256758 +0000 UTC m=+1033.403683471" observedRunningTime="2026-03-13 20:45:03.630845114 +0000 UTC m=+1034.651961005" watchObservedRunningTime="2026-03-13 20:45:03.633756743 +0000 UTC m=+1034.654872634" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.636760 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" podStartSLOduration=27.823493318 podStartE2EDuration="34.636749234s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:55.471992554 +0000 UTC m=+1026.493108445" lastFinishedPulling="2026-03-13 20:45:02.28524847 +0000 UTC m=+1033.306364361" observedRunningTime="2026-03-13 20:45:03.607625945 +0000 UTC m=+1034.628741856" watchObservedRunningTime="2026-03-13 20:45:03.636749234 +0000 UTC m=+1034.657865125" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.646539 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" podStartSLOduration=3.64652531 podStartE2EDuration="3.64652531s" podCreationTimestamp="2026-03-13 20:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:45:03.643514999 +0000 UTC m=+1034.664630910" watchObservedRunningTime="2026-03-13 20:45:03.64652531 +0000 UTC m=+1034.667641201" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.674756 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" podStartSLOduration=4.247212843 podStartE2EDuration="35.674738915s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.887759192 +0000 UTC m=+1001.908875083" lastFinishedPulling="2026-03-13 20:45:02.315285254 +0000 UTC m=+1033.336401155" observedRunningTime="2026-03-13 20:45:03.67156402 +0000 UTC m=+1034.692679901" watchObservedRunningTime="2026-03-13 20:45:03.674738915 +0000 UTC m=+1034.695854806" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.698399 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" podStartSLOduration=3.94746725 podStartE2EDuration="35.698317395s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.631003356 +0000 UTC m=+1001.652119247" lastFinishedPulling="2026-03-13 20:45:02.381853501 +0000 UTC m=+1033.402969392" observedRunningTime="2026-03-13 20:45:03.695678653 +0000 UTC m=+1034.716794564" watchObservedRunningTime="2026-03-13 20:45:03.698317395 +0000 UTC m=+1034.719433286" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.742479 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" podStartSLOduration=34.742462103 podStartE2EDuration="34.742462103s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 20:45:03.738203207 +0000 UTC m=+1034.759319118" watchObservedRunningTime="2026-03-13 20:45:03.742462103 +0000 UTC m=+1034.763577994" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.759541 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" podStartSLOduration=28.077902851 podStartE2EDuration="34.759526415s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:55.603522553 +0000 UTC m=+1026.624638444" lastFinishedPulling="2026-03-13 20:45:02.285146097 +0000 UTC m=+1033.306262008" observedRunningTime="2026-03-13 20:45:03.757159571 +0000 UTC m=+1034.778275462" watchObservedRunningTime="2026-03-13 20:45:03.759526415 +0000 UTC m=+1034.780642306" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.800014 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" podStartSLOduration=3.466220394 podStartE2EDuration="34.799990813s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.04804255 +0000 UTC m=+1002.069158441" lastFinishedPulling="2026-03-13 20:45:02.381812969 +0000 UTC m=+1033.402928860" observedRunningTime="2026-03-13 20:45:03.795895872 +0000 UTC m=+1034.817011763" watchObservedRunningTime="2026-03-13 20:45:03.799990813 +0000 UTC m=+1034.821106704" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.819655 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" podStartSLOduration=4.001046414 podStartE2EDuration="35.819636066s" podCreationTimestamp="2026-03-13 20:44:28 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.562701933 +0000 UTC m=+1001.583817824" lastFinishedPulling="2026-03-13 20:45:02.381291585 +0000 UTC m=+1033.402407476" observedRunningTime="2026-03-13 20:45:03.816306216 +0000 UTC m=+1034.837422117" watchObservedRunningTime="2026-03-13 20:45:03.819636066 +0000 UTC m=+1034.840751957" Mar 13 20:45:03 crc kubenswrapper[4790]: I0313 20:45:03.855947 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" podStartSLOduration=3.361725488 podStartE2EDuration="34.855923s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.887441773 +0000 UTC m=+1001.908557664" lastFinishedPulling="2026-03-13 20:45:02.381639285 +0000 UTC m=+1033.402755176" observedRunningTime="2026-03-13 20:45:03.850973157 +0000 UTC m=+1034.872089058" watchObservedRunningTime="2026-03-13 20:45:03.855923 +0000 UTC m=+1034.877038891" Mar 13 20:45:04 crc kubenswrapper[4790]: I0313 20:45:04.591764 4790 generic.go:334] "Generic (PLEG): container finished" podID="0001db4d-b91a-473e-bfff-794d8663885f" containerID="0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3" exitCode=0 Mar 13 20:45:04 crc kubenswrapper[4790]: I0313 20:45:04.591855 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerDied","Data":"0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3"} Mar 13 20:45:05 crc kubenswrapper[4790]: I0313 20:45:05.908513 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.042857 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") pod \"0001db4d-b91a-473e-bfff-794d8663885f\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.042929 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") pod \"0001db4d-b91a-473e-bfff-794d8663885f\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.042966 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") pod \"0001db4d-b91a-473e-bfff-794d8663885f\" (UID: \"0001db4d-b91a-473e-bfff-794d8663885f\") " Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.043581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume" (OuterVolumeSpecName: "config-volume") pod "0001db4d-b91a-473e-bfff-794d8663885f" (UID: "0001db4d-b91a-473e-bfff-794d8663885f"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.047522 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "0001db4d-b91a-473e-bfff-794d8663885f" (UID: "0001db4d-b91a-473e-bfff-794d8663885f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.047597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9" (OuterVolumeSpecName: "kube-api-access-dm7l9") pod "0001db4d-b91a-473e-bfff-794d8663885f" (UID: "0001db4d-b91a-473e-bfff-794d8663885f"). InnerVolumeSpecName "kube-api-access-dm7l9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.144866 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/0001db4d-b91a-473e-bfff-794d8663885f-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.144906 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dm7l9\" (UniqueName: \"kubernetes.io/projected/0001db4d-b91a-473e-bfff-794d8663885f-kube-api-access-dm7l9\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.144917 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0001db4d-b91a-473e-bfff-794d8663885f-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.605730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" event={"ID":"0001db4d-b91a-473e-bfff-794d8663885f","Type":"ContainerDied","Data":"a5074ae610244688560a689dbf0c1ccfe6efc4d574e04194d1a7f6904e949e82"} Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.606022 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5074ae610244688560a689dbf0c1ccfe6efc4d574e04194d1a7f6904e949e82" Mar 13 20:45:06 crc kubenswrapper[4790]: I0313 20:45:06.606071 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.132741 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.133066 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.184581 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.652940 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:08 crc kubenswrapper[4790]: I0313 20:45:08.709462 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.425367 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-77b6666d85-q5nj7" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.467770 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-6d9d6b584d-nzdzx" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.505497 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-5bc894d9b-wfltj" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.547098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-57b484b4df-hlk9s" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.604258 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-d47688694-s8p67" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.605225 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.605625 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-984cd4dcf-5plwh" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.626000 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-66d56f6ff4-h7rc9" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.680355 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-776c5696bf-dxntp" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.712706 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-5964f64c48-tzx96" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.802745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5f4f55cb5c-tbbfl" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.869187 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-574d45c66c-c9lbv" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.878422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-bbc5b68f9-hwdv8" Mar 13 20:45:09 crc kubenswrapper[4790]: I0313 20:45:09.920901 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-7f9cc5dd44-ppzzz" Mar 13 20:45:10 crc kubenswrapper[4790]: I0313 20:45:10.630095 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xjs8f" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" containerID="cri-o://364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268" gracePeriod=2 Mar 13 20:45:12 crc kubenswrapper[4790]: I0313 20:45:12.007961 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5698bc49b8-xpzcd" Mar 13 20:45:12 crc kubenswrapper[4790]: I0313 20:45:12.658084 4790 generic.go:334] "Generic (PLEG): container finished" podID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerID="364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268" exitCode=0 Mar 13 20:45:12 crc kubenswrapper[4790]: I0313 20:45:12.658152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268"} Mar 13 20:45:13 crc kubenswrapper[4790]: I0313 20:45:13.667631 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" 
event={"ID":"77f24ce6-bc52-4831-902c-255983a8f911","Type":"ContainerStarted","Data":"9ab2278e56369e3252dff8c8bfa590a2710e8d4ea2e820e8c4aa1f19459dc5c3"} Mar 13 20:45:13 crc kubenswrapper[4790]: I0313 20:45:13.667837 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 20:45:13 crc kubenswrapper[4790]: I0313 20:45:13.684575 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" podStartSLOduration=2.9125516 podStartE2EDuration="44.684548058s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:30.600959811 +0000 UTC m=+1001.622075702" lastFinishedPulling="2026-03-13 20:45:12.372956269 +0000 UTC m=+1043.394072160" observedRunningTime="2026-03-13 20:45:13.680453206 +0000 UTC m=+1044.701569097" watchObservedRunningTime="2026-03-13 20:45:13.684548058 +0000 UTC m=+1044.705663949" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.389360 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.459580 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") pod \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.459640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") pod \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.459814 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") pod \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\" (UID: \"f4f563dc-7ac2-4d78-96d4-55d27013fec4\") " Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.460951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities" (OuterVolumeSpecName: "utilities") pod "f4f563dc-7ac2-4d78-96d4-55d27013fec4" (UID: "f4f563dc-7ac2-4d78-96d4-55d27013fec4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.465274 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2" (OuterVolumeSpecName: "kube-api-access-d2gr2") pod "f4f563dc-7ac2-4d78-96d4-55d27013fec4" (UID: "f4f563dc-7ac2-4d78-96d4-55d27013fec4"). InnerVolumeSpecName "kube-api-access-d2gr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.516713 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f4f563dc-7ac2-4d78-96d4-55d27013fec4" (UID: "f4f563dc-7ac2-4d78-96d4-55d27013fec4"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.561735 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2gr2\" (UniqueName: \"kubernetes.io/projected/f4f563dc-7ac2-4d78-96d4-55d27013fec4-kube-api-access-d2gr2\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.562109 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.562185 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f4f563dc-7ac2-4d78-96d4-55d27013fec4-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.674177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" event={"ID":"386f7e46-c2e3-4eae-aa82-05075883c889","Type":"ContainerStarted","Data":"ff240f9b939f7e4ac057b8763a167915e478dd487a4bedb6647509ae575e5640"} Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.675177 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.676611 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xjs8f" event={"ID":"f4f563dc-7ac2-4d78-96d4-55d27013fec4","Type":"ContainerDied","Data":"2c5f243728839b68f10d728f37eb16ef3d5e7896648c916636341303be93e6ac"} Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.676627 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xjs8f" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.676653 4790 scope.go:117] "RemoveContainer" containerID="364294eac139e93e008805a4dd45727e95be76e08a1a75d2522639679a070268" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.678388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" event={"ID":"22e6d110-bd87-4d28-851d-307b4223ee8f","Type":"ContainerStarted","Data":"d74da75256df8f2ddc2083a14d07763c7a2a17638b38e47beb2c9483cd7377b4"} Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.704576 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" podStartSLOduration=2.321262355 podStartE2EDuration="45.704551455s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.001290941 +0000 UTC m=+1002.022406852" lastFinishedPulling="2026-03-13 20:45:14.384580061 +0000 UTC m=+1045.405695952" observedRunningTime="2026-03-13 20:45:14.696676941 +0000 UTC m=+1045.717792832" watchObservedRunningTime="2026-03-13 20:45:14.704551455 +0000 UTC m=+1045.725667346" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.704666 4790 scope.go:117] "RemoveContainer" containerID="5fcf1e83068560d955d24f52c0498d142b90a5e9d5b8dc6a7099e4dc67703b1f" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.723459 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.730612 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xjs8f"] Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.732162 4790 scope.go:117] "RemoveContainer" containerID="fb40f1dd553eeada38a0eebd58e1e4a584d8b04e5e145194a1581e6f7877058c" Mar 13 20:45:14 crc kubenswrapper[4790]: I0313 20:45:14.735585 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-xvrl9" podStartSLOduration=2.6264852579999998 podStartE2EDuration="45.735546218s" podCreationTimestamp="2026-03-13 20:44:29 +0000 UTC" firstStartedPulling="2026-03-13 20:44:31.272566781 +0000 UTC m=+1002.293682672" lastFinishedPulling="2026-03-13 20:45:14.381627741 +0000 UTC m=+1045.402743632" observedRunningTime="2026-03-13 20:45:14.73011631 +0000 UTC m=+1045.751232201" watchObservedRunningTime="2026-03-13 20:45:14.735546218 +0000 UTC m=+1045.756662109" Mar 13 20:45:15 crc kubenswrapper[4790]: I0313 20:45:15.112817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-54dc5b8f8d-jrr7h" Mar 13 20:45:15 crc kubenswrapper[4790]: I0313 20:45:15.402250 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn" Mar 13 20:45:15 crc kubenswrapper[4790]: I0313 20:45:15.667815 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" path="/var/lib/kubelet/pods/f4f563dc-7ac2-4d78-96d4-55d27013fec4/volumes" Mar 13 20:45:19 crc kubenswrapper[4790]: I0313 20:45:19.529995 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-684f77d66d-5vcsg" Mar 13 
20:45:19 crc kubenswrapper[4790]: I0313 20:45:19.753833 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-7f84474648-b8lpj" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.675931 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.679352 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0001db4d-b91a-473e-bfff-794d8663885f" containerName="collect-profiles" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.679571 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0001db4d-b91a-473e-bfff-794d8663885f" containerName="collect-profiles" Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.679656 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-utilities" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.679739 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-utilities" Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.679827 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-content" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.679900 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="extract-content" Mar 13 20:45:41 crc kubenswrapper[4790]: E0313 20:45:41.680007 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.680081 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.680314 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f563dc-7ac2-4d78-96d4-55d27013fec4" containerName="registry-server" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.680444 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0001db4d-b91a-473e-bfff-794d8663885f" containerName="collect-profiles" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.681428 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.688862 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-5lwx2" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.689830 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.690009 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.690173 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.711725 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.744005 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.745458 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.750672 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.756913 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.779270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.779334 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.880640 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.880712 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.880969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 
crc kubenswrapper[4790]: I0313 20:45:41.881111 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.881201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.882096 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.899073 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"dnsmasq-dns-675f4bcbfc-shfrx\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.982303 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.982363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.982420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.983872 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.984139 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:41 crc kubenswrapper[4790]: I0313 20:45:41.998344 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"dnsmasq-dns-78dd6ddcc-5s4r8\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.006702 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.066065 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.421829 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.521780 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.938350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" event={"ID":"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf","Type":"ContainerStarted","Data":"1e13e0bcda642d94f6b249dc823d2fd87698f812917e6d7b60f2ffc56fbe460d"} Mar 13 20:45:42 crc kubenswrapper[4790]: I0313 20:45:42.939763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" event={"ID":"61d662b4-cdc6-4d2f-a8a6-f71db4380caa","Type":"ContainerStarted","Data":"efbaf88a68a9782b2a6db13fe0d06640d640fb6b93b4efe14c9fed10e5292a92"} Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.598815 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.611962 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.613297 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.631271 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.738182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.738236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.738278 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.843968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.844024 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.844076 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.845156 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.845199 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.870575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zkr4\" (UniqueName: 
\"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"dnsmasq-dns-5ccc8479f9-c6rxs\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.882712 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.902244 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.903491 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.919940 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:45:44 crc kubenswrapper[4790]: I0313 20:45:44.945677 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.046791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.046851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.046915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.149070 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.149355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.149422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.150584 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.151124 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.183409 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"dnsmasq-dns-57d769cc4f-rkgwq\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.229259 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.461714 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:45:45 crc kubenswrapper[4790]: W0313 20:45:45.465263 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3603867e_b715_48af_b4d3_248f69035bf4.slice/crio-d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5 WatchSource:0}: Error finding container d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5: Status 404 returned error can't find the container with id d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5 Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.635963 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:45:45 crc kubenswrapper[4790]: W0313 20:45:45.639280 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb63dd900_9f63_4b6a_b620_bd1dfaa88cfe.slice/crio-137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844 WatchSource:0}: Error finding container 137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844: Status 404 returned error can't find the container with id 137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844 Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.777847 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.779816 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.781908 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.781917 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.783134 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.783288 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.784272 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6fg95" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.786018 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.786132 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.786262 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964699 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964725 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.964770 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965242 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965267 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965298 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965531 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.965562 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.982901 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerStarted","Data":"137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844"} Mar 13 20:45:45 crc kubenswrapper[4790]: I0313 20:45:45.985041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerStarted","Data":"d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5"} Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.051651 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.054298 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057217 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057344 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bssvd" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057421 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057574 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.057968 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.072538 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.073071 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077616 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077697 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.077967 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.078045 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.083662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.083740 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.084744 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.086043 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.087510 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.087844 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.088238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.088313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.093121 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.093717 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.106631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.117021 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.162829 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185605 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185668 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185689 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185709 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185730 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185773 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.185867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod 
\"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288785 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288877 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.288948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289033 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289095 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289120 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289171 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.289992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.290787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.290890 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.291035 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.291097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.291127 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.295140 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.295679 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.295792 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.298186 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.307414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.312517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " pod="openstack/rabbitmq-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.406621 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:45:46 crc kubenswrapper[4790]: I0313 20:45:46.518731 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.256993 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.259068 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.261640 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.262033 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.263090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lhxlc" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.263947 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.268464 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.273072 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409788 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-default\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409845 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409873 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.409915 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410184 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kolla-config\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.410350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qp6k\" (UniqueName: \"kubernetes.io/projected/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kube-api-access-4qp6k\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-default\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511603 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511632 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511655 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.511755 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kolla-config\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.512000 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") device mount path 
\"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.512340 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-generated\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.512933 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-config-data-default\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.513000 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qp6k\" (UniqueName: \"kubernetes.io/projected/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kube-api-access-4qp6k\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.513025 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kolla-config\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.513272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fceb0829-5f0e-4e78-a803-61afc5aa4d60-operator-scripts\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.517497 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.517554 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fceb0829-5f0e-4e78-a803-61afc5aa4d60-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.537478 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qp6k\" (UniqueName: \"kubernetes.io/projected/fceb0829-5f0e-4e78-a803-61afc5aa4d60-kube-api-access-4qp6k\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.541686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"fceb0829-5f0e-4e78-a803-61afc5aa4d60\") " pod="openstack/openstack-galera-0" Mar 13 20:45:47 crc kubenswrapper[4790]: I0313 20:45:47.579674 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.604486 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.606131 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.609132 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.609571 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.609682 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.610081 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-dqbxb" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.612796 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.728670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.728969 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729048 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729086 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa2face0-9349-4482-880a-b23cf41099b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729119 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729333 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-466bt\" (UniqueName: \"kubernetes.io/projected/fa2face0-9349-4482-880a-b23cf41099b2-kube-api-access-466bt\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.729418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830674 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830727 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa2face0-9349-4482-880a-b23cf41099b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830810 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: 
\"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.830829 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-466bt\" (UniqueName: \"kubernetes.io/projected/fa2face0-9349-4482-880a-b23cf41099b2-kube-api-access-466bt\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.831779 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.832397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.832697 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.832705 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/fa2face0-9349-4482-880a-b23cf41099b2-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.833316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa2face0-9349-4482-880a-b23cf41099b2-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.849155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.849762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa2face0-9349-4482-880a-b23cf41099b2-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.851812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-466bt\" (UniqueName: \"kubernetes.io/projected/fa2face0-9349-4482-880a-b23cf41099b2-kube-api-access-466bt\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc 
kubenswrapper[4790]: I0313 20:45:48.852518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"openstack-cell1-galera-0\" (UID: \"fa2face0-9349-4482-880a-b23cf41099b2\") " pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.876183 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.877467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.879472 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.879698 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-jhfzc" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.879828 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.889960 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.938288 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.939880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kube-api-access-94gv9\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940028 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940289 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940454 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kolla-config\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:48 crc kubenswrapper[4790]: I0313 20:45:48.940520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-config-data\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kolla-config\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-config-data\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042346 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kube-api-access-94gv9\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042399 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.042430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.043122 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kolla-config\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.043464 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/3980f8da-ddaa-4634-8c09-1a71ae19c58f-config-data\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.045601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-combined-ca-bundle\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.049852 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/3980f8da-ddaa-4634-8c09-1a71ae19c58f-memcached-tls-certs\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.064059 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94gv9\" (UniqueName: \"kubernetes.io/projected/3980f8da-ddaa-4634-8c09-1a71ae19c58f-kube-api-access-94gv9\") pod \"memcached-0\" (UID: \"3980f8da-ddaa-4634-8c09-1a71ae19c58f\") " pod="openstack/memcached-0" Mar 13 20:45:49 crc kubenswrapper[4790]: I0313 20:45:49.246698 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.076974 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.078546 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.080987 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-hthr2" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.091117 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.171351 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"kube-state-metrics-0\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.272213 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"kube-state-metrics-0\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.299329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"kube-state-metrics-0\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " pod="openstack/kube-state-metrics-0" Mar 13 20:45:51 crc kubenswrapper[4790]: I0313 20:45:51.407444 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.241412 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vspq5"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.242762 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.248444 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.248500 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-2xbsh" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.248685 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.262090 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.320714 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-k7bzr"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.322758 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.329388 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7bzr"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334593 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c72ac557-7882-4120-b64a-4343639cc766-scripts\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334618 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-combined-ca-bundle\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334639 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5gzg\" (UniqueName: \"kubernetes.io/projected/c72ac557-7882-4120-b64a-4343639cc766-kube-api-access-s5gzg\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334743 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-ovn-controller-tls-certs\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.334761 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-log-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438330 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-lib\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-log\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlmbq\" (UniqueName: \"kubernetes.io/projected/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-kube-api-access-nlmbq\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438477 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5gzg\" (UniqueName: \"kubernetes.io/projected/c72ac557-7882-4120-b64a-4343639cc766-kube-api-access-s5gzg\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438515 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-scripts\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-etc-ovs\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438564 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-ovn-controller-tls-certs\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438588 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-log-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.438645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c72ac557-7882-4120-b64a-4343639cc766-scripts\") pod \"ovn-controller-vspq5\" 
(UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.439183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.439943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-run\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.439977 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-combined-ca-bundle\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.440609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-run-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.440794 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c72ac557-7882-4120-b64a-4343639cc766-var-log-ovn\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.442982 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c72ac557-7882-4120-b64a-4343639cc766-scripts\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.450135 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-ovn-controller-tls-certs\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.450749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c72ac557-7882-4120-b64a-4343639cc766-combined-ca-bundle\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.459475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5gzg\" (UniqueName: \"kubernetes.io/projected/c72ac557-7882-4120-b64a-4343639cc766-kube-api-access-s5gzg\") pod \"ovn-controller-vspq5\" (UID: \"c72ac557-7882-4120-b64a-4343639cc766\") " pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.523867 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 
13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.524971 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.532445 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.532801 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.533399 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.533620 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.533764 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wqthz" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.541218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-scripts\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.541271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-etc-ovs\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.541966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-run\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.542055 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-lib\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.543564 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-log\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.543659 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlmbq\" (UniqueName: \"kubernetes.io/projected/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-kube-api-access-nlmbq\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544447 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-etc-ovs\") pod \"ovn-controller-ovs-k7bzr\" (UID: 
\"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544502 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-run\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544602 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-lib\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.544683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-var-log\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.555759 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.559917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-scripts\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.562323 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlmbq\" (UniqueName: \"kubernetes.io/projected/8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58-kube-api-access-nlmbq\") pod \"ovn-controller-ovs-k7bzr\" (UID: \"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58\") " pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.562713 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.637046 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.648758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.648945 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rgdxh\" (UniqueName: \"kubernetes.io/projected/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-kube-api-access-rgdxh\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649349 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649522 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-config\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.649716 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751623 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751671 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rgdxh\" (UniqueName: \"kubernetes.io/projected/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-kube-api-access-rgdxh\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-config\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.751881 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.756728 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.756986 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.758370 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " 
pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.760080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.764063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.767918 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-config\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.772892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.777108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rgdxh\" (UniqueName: \"kubernetes.io/projected/f5a24d7e-902f-4862-9c6b-8317f8fb3f29-kube-api-access-rgdxh\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.785213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"ovsdbserver-nb-0\" (UID: \"f5a24d7e-902f-4862-9c6b-8317f8fb3f29\") " pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:54 crc kubenswrapper[4790]: I0313 20:45:54.860937 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 13 20:45:57 crc kubenswrapper[4790]: I0313 20:45:57.678823 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.674652 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.710463 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.710667 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.714342 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.715112 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.715544 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-tcpfk" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.716120 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.716730 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834391 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-config\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834644 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " 
pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.834841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvg4t\" (UniqueName: \"kubernetes.io/projected/ba4867dc-70fb-4533-a075-31fc03f7ef33-kube-api-access-lvg4t\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.851992 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.861050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: W0313 20:45:58.872957 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfceb0829_5f0e_4e78_a803_61afc5aa4d60.slice/crio-46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9 WatchSource:0}: Error finding container 46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9: Status 404 returned error can't find the container with id 46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9 Mar 13 20:45:58 crc kubenswrapper[4790]: W0313 20:45:58.902505 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc575f482_56cd_4dfc_84c6_c6bb922d56a9.slice/crio-dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761 WatchSource:0}: Error finding container dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761: Status 404 returned error can't find the container with id dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761 Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.936787 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvg4t\" (UniqueName: \"kubernetes.io/projected/ba4867dc-70fb-4533-a075-31fc03f7ef33-kube-api-access-lvg4t\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937489 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-config\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.937602 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.938092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.938455 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.939476 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-config\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.939568 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ba4867dc-70fb-4533-a075-31fc03f7ef33-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.943423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.947165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.949420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba4867dc-70fb-4533-a075-31fc03f7ef33-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.961108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvg4t\" (UniqueName: \"kubernetes.io/projected/ba4867dc-70fb-4533-a075-31fc03f7ef33-kube-api-access-lvg4t\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.972081 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.985869 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"ovsdbserver-sb-0\" (UID: \"ba4867dc-70fb-4533-a075-31fc03f7ef33\") " pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:58 crc kubenswrapper[4790]: I0313 20:45:58.992610 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5"] Mar 13 20:45:58 crc kubenswrapper[4790]: W0313 20:45:58.997854 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc72ac557_7882_4120_b64a_4343639cc766.slice/crio-873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca WatchSource:0}: Error finding container 873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca: Status 404 returned error can't find the container with id 873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.002180 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 13 20:45:59 crc kubenswrapper[4790]: W0313 20:45:59.007656 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3980f8da_ddaa_4634_8c09_1a71ae19c58f.slice/crio-b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673 WatchSource:0}: Error finding container b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673: Status 404 returned error can't find the container with id b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.076015 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 13 20:45:59 crc kubenswrapper[4790]: W0313 20:45:59.078923 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a24d7e_902f_4862_9c6b_8317f8fb3f29.slice/crio-438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c WatchSource:0}: Error finding container 438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c: Status 404 returned error can't find the container with id 438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.101420 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3980f8da-ddaa-4634-8c09-1a71ae19c58f","Type":"ContainerStarted","Data":"b119658e51cdaa8792d7650fa1e9769da237195e0e856928722f4699a4f3f673"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.110291 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerStarted","Data":"dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.123948 4790 generic.go:334] "Generic (PLEG): container finished" podID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerID="d5635f334bb1d0f55f8df6048568c51547f61cdf8fa854744c6f631fac79f9eb" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.124004 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" event={"ID":"61d662b4-cdc6-4d2f-a8a6-f71db4380caa","Type":"ContainerDied","Data":"d5635f334bb1d0f55f8df6048568c51547f61cdf8fa854744c6f631fac79f9eb"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.125966 4790 generic.go:334] "Generic (PLEG): container finished" podID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerID="28a84259682ac8b19ed7f572691d8c2369de14cf6cf51002c97c47560eb5ee72" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.126078 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" event={"ID":"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf","Type":"ContainerDied","Data":"28a84259682ac8b19ed7f572691d8c2369de14cf6cf51002c97c47560eb5ee72"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.128775 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerStarted","Data":"46cad1b617a70f8bcf113971d68047eede9646e0f3e9d5ff131aa856a2f3f0f9"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.130072 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f5a24d7e-902f-4862-9c6b-8317f8fb3f29","Type":"ContainerStarted","Data":"438fa1f81223640149aaf756dfb2e57cd8cba9c4e3612bceda6c41fc1b7f3a5c"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.131154 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerStarted","Data":"a3ba4dde9b3affbf2de80fd01b6004ec5bcc39b41c69eac7056b983bf5ce8c10"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.132103 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerStarted","Data":"9412b68e06c64733faf3ac7751a6a1b8d9727402ab2300755813801ae5bb6cae"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.133097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerStarted","Data":"6218f617d211db14656d09a088c6de02a6677348fa07bdf9d142d99af0111ad7"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.135068 4790 generic.go:334] "Generic (PLEG): container finished" podID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerID="8263c960933930e9418f327d1c70a8da265ccd7214d4f221c7150a432da81ec8" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.135125 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerDied","Data":"8263c960933930e9418f327d1c70a8da265ccd7214d4f221c7150a432da81ec8"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.137609 4790 
generic.go:334] "Generic (PLEG): container finished" podID="3603867e-b715-48af-b4d3-248f69035bf4" containerID="49d625d0111656eb749d168f5c6aa08a6533bb845529b49927ec4ee997aab45d" exitCode=0 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.137833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerDied","Data":"49d625d0111656eb749d168f5c6aa08a6533bb845529b49927ec4ee997aab45d"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.150732 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5" event={"ID":"c72ac557-7882-4120-b64a-4343639cc766","Type":"ContainerStarted","Data":"873da259123ec8b3b0869fe0330ca134574dd19dc1fdd69b82ccb26bd1fd40ca"} Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.211396 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.274928 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-k7bzr"] Mar 13 20:45:59 crc kubenswrapper[4790]: W0313 20:45:59.308909 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c2d7175_fc2b_4492_ac1c_e2cc3dd44c58.slice/crio-9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168 WatchSource:0}: Error finding container 9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168: Status 404 returned error can't find the container with id 9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168 Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.567743 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.596406 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663122 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") pod \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663524 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") pod \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") pod \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663691 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") pod \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\" (UID: \"61d662b4-cdc6-4d2f-a8a6-f71db4380caa\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.663718 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") pod \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\" (UID: \"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf\") " Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.668300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb" (OuterVolumeSpecName: "kube-api-access-2d2bb") pod "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" (UID: "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf"). InnerVolumeSpecName "kube-api-access-2d2bb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.668647 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2" (OuterVolumeSpecName: "kube-api-access-bb6c2") pod "61d662b4-cdc6-4d2f-a8a6-f71db4380caa" (UID: "61d662b4-cdc6-4d2f-a8a6-f71db4380caa"). InnerVolumeSpecName "kube-api-access-bb6c2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.685838 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config" (OuterVolumeSpecName: "config") pod "61d662b4-cdc6-4d2f-a8a6-f71db4380caa" (UID: "61d662b4-cdc6-4d2f-a8a6-f71db4380caa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.692624 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" (UID: "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.705679 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config" (OuterVolumeSpecName: "config") pod "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" (UID: "ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.794618 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-nrv7g"] Mar 13 20:45:59 crc kubenswrapper[4790]: E0313 20:45:59.795054 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795071 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: E0313 20:45:59.795085 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795093 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795294 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.795308 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" containerName="init" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.796008 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.797957 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-combined-ca-bundle\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovn-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovs-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803274 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdn6h\" (UniqueName: \"kubernetes.io/projected/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-kube-api-access-cdn6h\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-config\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803429 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803445 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb6c2\" (UniqueName: \"kubernetes.io/projected/61d662b4-cdc6-4d2f-a8a6-f71db4380caa-kube-api-access-bb6c2\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803462 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d2bb\" (UniqueName: \"kubernetes.io/projected/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-kube-api-access-2d2bb\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: 
I0313 20:45:59.803474 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.803488 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.806761 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nrv7g"] Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905144 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-config\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-combined-ca-bundle\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905252 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovn-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.906402 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-config\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.906479 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovn-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.905322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.908686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovs-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.908735 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdn6h\" (UniqueName: 
\"kubernetes.io/projected/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-kube-api-access-cdn6h\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.909341 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-ovs-rundir\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.909923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-combined-ca-bundle\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.910603 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:45:59 crc kubenswrapper[4790]: I0313 20:45:59.961615 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdn6h\" (UniqueName: \"kubernetes.io/projected/dfb0e0ca-d164-4e22-9d3f-055a45a372d2-kube-api-access-cdn6h\") pod \"ovn-controller-metrics-nrv7g\" (UID: \"dfb0e0ca-d164-4e22-9d3f-055a45a372d2\") " pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.057609 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.104779 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.106480 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.109125 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.115913 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.115976 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.116019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.116076 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.118320 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.124693 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-nrv7g" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.160280 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.186752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerStarted","Data":"e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.187100 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.191438 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.192714 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerStarted","Data":"9af32c1da0d6123293f1baa28364e78968ac47013cc06dd734f41c791ba7c168"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.201318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" event={"ID":"ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf","Type":"ContainerDied","Data":"1e13e0bcda642d94f6b249dc823d2fd87698f812917e6d7b60f2ffc56fbe460d"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.201387 4790 scope.go:117] "RemoveContainer" containerID="28a84259682ac8b19ed7f572691d8c2369de14cf6cf51002c97c47560eb5ee72" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.201641 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-5s4r8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.213338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" event={"ID":"61d662b4-cdc6-4d2f-a8a6-f71db4380caa","Type":"ContainerDied","Data":"efbaf88a68a9782b2a6db13fe0d06640d640fb6b93b4efe14c9fed10e5292a92"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.213547 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-shfrx" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216440 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216501 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216582 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.216653 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.218593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.219013 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.219485 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.219609 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.220567 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.221173 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerStarted","Data":"195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8"} Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.221832 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.226045 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.226068 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.226122 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.234734 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.239693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"dnsmasq-dns-7fd796d7df-wrjx8\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.271021 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.272421 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.278105 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.279852 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.280034 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" podStartSLOduration=3.37372235 podStartE2EDuration="16.280019664s" podCreationTimestamp="2026-03-13 20:45:44 +0000 UTC" firstStartedPulling="2026-03-13 20:45:45.468018889 +0000 UTC m=+1076.489134780" lastFinishedPulling="2026-03-13 20:45:58.374316203 +0000 UTC m=+1089.395432094" observedRunningTime="2026-03-13 20:46:00.212873227 +0000 UTC m=+1091.233989128" watchObservedRunningTime="2026-03-13 20:46:00.280019664 +0000 UTC m=+1091.301135555" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.294652 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" podStartSLOduration=3.59204327 podStartE2EDuration="16.294633661s" podCreationTimestamp="2026-03-13 20:45:44 +0000 UTC" firstStartedPulling="2026-03-13 20:45:45.641568361 +0000 UTC m=+1076.662684252" lastFinishedPulling="2026-03-13 20:45:58.344158752 +0000 UTC m=+1089.365274643" observedRunningTime="2026-03-13 20:46:00.246364838 +0000 UTC m=+1091.267480729" watchObservedRunningTime="2026-03-13 20:46:00.294633661 +0000 UTC m=+1091.315749552" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318810 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318958 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.318991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") 
pod \"auto-csr-approver-29557246-lrvrv\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.319037 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.359549 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.366232 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-5s4r8"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.379667 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.396475 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-shfrx"] Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419897 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419945 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419971 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"auto-csr-approver-29557246-lrvrv\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.419999 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.420035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.420053 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" 
Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.420929 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.421167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.421676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.421925 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.435787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"auto-csr-approver-29557246-lrvrv\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.438257 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"dnsmasq-dns-86db49b7ff-7w2fv\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.439869 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.660878 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:00 crc kubenswrapper[4790]: I0313 20:46:00.671448 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:01 crc kubenswrapper[4790]: I0313 20:46:01.232948 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" containerID="cri-o://195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8" gracePeriod=10 Mar 13 20:46:01 crc kubenswrapper[4790]: I0313 20:46:01.679557 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61d662b4-cdc6-4d2f-a8a6-f71db4380caa" path="/var/lib/kubelet/pods/61d662b4-cdc6-4d2f-a8a6-f71db4380caa/volumes" Mar 13 20:46:01 crc kubenswrapper[4790]: I0313 20:46:01.680187 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf" path="/var/lib/kubelet/pods/ea90ef6a-4f93-4c68-9527-9cfaf1c75fcf/volumes" Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.245908 4790 generic.go:334] "Generic (PLEG): container finished" podID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerID="195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8" exitCode=0 Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.245980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerDied","Data":"195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8"} Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.246196 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" containerID="cri-o://e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882" gracePeriod=10 Mar 13 20:46:02 crc kubenswrapper[4790]: I0313 20:46:02.730390 4790 scope.go:117] "RemoveContainer" containerID="d5635f334bb1d0f55f8df6048568c51547f61cdf8fa854744c6f631fac79f9eb" Mar 13 20:46:03 crc kubenswrapper[4790]: I0313 20:46:03.264875 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ba4867dc-70fb-4533-a075-31fc03f7ef33","Type":"ContainerStarted","Data":"2dddf44d9b971a802eb68d20dd187687fce5545cfc8a47258dff11cb22b24e69"} Mar 13 20:46:03 crc kubenswrapper[4790]: I0313 20:46:03.267138 4790 generic.go:334] "Generic (PLEG): container finished" podID="3603867e-b715-48af-b4d3-248f69035bf4" containerID="e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882" exitCode=0 Mar 13 20:46:03 crc kubenswrapper[4790]: I0313 20:46:03.267187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerDied","Data":"e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882"} Mar 13 20:46:04 crc kubenswrapper[4790]: I0313 20:46:04.863871 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.022291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") pod \"3603867e-b715-48af-b4d3-248f69035bf4\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.022710 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") pod \"3603867e-b715-48af-b4d3-248f69035bf4\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.022762 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") pod \"3603867e-b715-48af-b4d3-248f69035bf4\" (UID: \"3603867e-b715-48af-b4d3-248f69035bf4\") " Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.032362 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4" (OuterVolumeSpecName: "kube-api-access-5zkr4") pod "3603867e-b715-48af-b4d3-248f69035bf4" (UID: "3603867e-b715-48af-b4d3-248f69035bf4"). InnerVolumeSpecName "kube-api-access-5zkr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.056666 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3603867e-b715-48af-b4d3-248f69035bf4" (UID: "3603867e-b715-48af-b4d3-248f69035bf4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.057513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config" (OuterVolumeSpecName: "config") pod "3603867e-b715-48af-b4d3-248f69035bf4" (UID: "3603867e-b715-48af-b4d3-248f69035bf4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.124876 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.124915 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3603867e-b715-48af-b4d3-248f69035bf4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.124927 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zkr4\" (UniqueName: \"kubernetes.io/projected/3603867e-b715-48af-b4d3-248f69035bf4-kube-api-access-5zkr4\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.291533 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" event={"ID":"3603867e-b715-48af-b4d3-248f69035bf4","Type":"ContainerDied","Data":"d0ca1d2819be476a413f05b3b41099985b59f5c5964c2c706ee5cb54a75ac1d5"} Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.291618 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-c6rxs" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.351314 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.358169 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-c6rxs"] Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.670742 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3603867e-b715-48af-b4d3-248f69035bf4" path="/var/lib/kubelet/pods/3603867e-b715-48af-b4d3-248f69035bf4/volumes" Mar 13 20:46:05 crc kubenswrapper[4790]: I0313 20:46:05.952005 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.041191 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") pod \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.041280 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") pod \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.041353 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") pod \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\" (UID: \"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe\") " Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.044457 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7" (OuterVolumeSpecName: "kube-api-access-v8ck7") pod "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" (UID: "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe"). 
InnerVolumeSpecName "kube-api-access-v8ck7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.084083 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" (UID: "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.089732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config" (OuterVolumeSpecName: "config") pod "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" (UID: "b63dd900-9f63-4b6a-b620-bd1dfaa88cfe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.144449 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v8ck7\" (UniqueName: \"kubernetes.io/projected/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-kube-api-access-v8ck7\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.144480 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.144489 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.300492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" event={"ID":"b63dd900-9f63-4b6a-b620-bd1dfaa88cfe","Type":"ContainerDied","Data":"137f1a12183cad427eac35ff712ec2bc7e38f51287a23e598f66e4d7a4466844"} Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.300557 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.333582 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:46:06 crc kubenswrapper[4790]: I0313 20:46:06.340872 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-rkgwq"] Mar 13 20:46:07 crc kubenswrapper[4790]: I0313 20:46:07.670191 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" path="/var/lib/kubelet/pods/b63dd900-9f63-4b6a-b620-bd1dfaa88cfe/volumes" Mar 13 20:46:07 crc kubenswrapper[4790]: I0313 20:46:07.864321 4790 scope.go:117] "RemoveContainer" containerID="e4851609b13daf386b9a75dd93b93d11e73aa47ed2720e3772d9de0eeedc4882" Mar 13 20:46:08 crc kubenswrapper[4790]: I0313 20:46:08.465866 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:46:08 crc kubenswrapper[4790]: I0313 20:46:08.770698 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-nrv7g"] Mar 13 20:46:08 crc kubenswrapper[4790]: I0313 20:46:08.956929 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.140690 4790 scope.go:117] "RemoveContainer" containerID="49d625d0111656eb749d168f5c6aa08a6533bb845529b49927ec4ee997aab45d" Mar 13 20:46:09 crc kubenswrapper[4790]: W0313 20:46:09.186027 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfb0e0ca_d164_4e22_9d3f_055a45a372d2.slice/crio-361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d WatchSource:0}: Error finding container 361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d: Status 404 returned error can't find the container with id 361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d Mar 13 20:46:09 crc kubenswrapper[4790]: W0313 20:46:09.261406 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5980214a_6a36_4a9b_bb65_1ca2b979d0cc.slice/crio-6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a WatchSource:0}: Error finding container 6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a: Status 404 returned error can't find the container with id 6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.304480 4790 scope.go:117] "RemoveContainer" containerID="195a67ad901633c714ab17db7e7888cdde3ea030f2d3e57f5ad0722e488347b8" Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.343619 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerStarted","Data":"6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a"} Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.348215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrv7g" event={"ID":"dfb0e0ca-d164-4e22-9d3f-055a45a372d2","Type":"ContainerStarted","Data":"361ffc08bda2f47fe8a49991e901f1ba5426f6bd7d04dd5c33869fc442ebda1d"} Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.349318 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29557246-lrvrv" event={"ID":"97e8561a-a685-44f0-986c-1559e5818ba8","Type":"ContainerStarted","Data":"43aee0bab3af6a8bfdda4bb90672879a71cedb5993a623158c3730e25f5f67ba"} Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.500140 4790 scope.go:117] "RemoveContainer" containerID="8263c960933930e9418f327d1c70a8da265ccd7214d4f221c7150a432da81ec8" Mar 13 20:46:09 crc kubenswrapper[4790]: I0313 20:46:09.584231 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:09 crc kubenswrapper[4790]: W0313 20:46:09.823343 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd371e679_2539_4a57_9993_6bd66f0d311e.slice/crio-e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261 WatchSource:0}: Error finding container e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261: Status 404 returned error can't find the container with id e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261 Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.230249 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-57d769cc4f-rkgwq" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.100:5353: i/o timeout" Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.359839 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"3980f8da-ddaa-4634-8c09-1a71ae19c58f","Type":"ContainerStarted","Data":"38687d46bd8558e8ff19623ab6b544af687549c5f2646731edfb1896ed86a605"} Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.360522 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.363976 4790 generic.go:334] "Generic (PLEG): container finished" podID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" exitCode=0 Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.364053 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerDied","Data":"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6"} Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.365736 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerStarted","Data":"e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261"} Mar 13 20:46:10 crc kubenswrapper[4790]: I0313 20:46:10.408890 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.113558536 podStartE2EDuration="22.408873801s" podCreationTimestamp="2026-03-13 20:45:48 +0000 UTC" firstStartedPulling="2026-03-13 20:45:59.009503521 +0000 UTC m=+1090.030619412" lastFinishedPulling="2026-03-13 20:46:09.304818786 +0000 UTC m=+1100.325934677" observedRunningTime="2026-03-13 20:46:10.379674536 +0000 UTC m=+1101.400790437" watchObservedRunningTime="2026-03-13 20:46:10.408873801 +0000 UTC m=+1101.429989692" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.375272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerStarted","Data":"86e0632ee7d85ec4a092fdd91f4cf4501da9716d6ba3776527053aa4e34f6f82"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.378761 4790 generic.go:334] "Generic (PLEG): container finished" podID="97e8561a-a685-44f0-986c-1559e5818ba8" containerID="a76e1c0d1beff75ffaa42ee8715fd9733a320b575bcb2a1602abbb7840ddf694" exitCode=0 Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.378814 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" event={"ID":"97e8561a-a685-44f0-986c-1559e5818ba8","Type":"ContainerDied","Data":"a76e1c0d1beff75ffaa42ee8715fd9733a320b575bcb2a1602abbb7840ddf694"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.382931 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerStarted","Data":"7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.383622 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.391938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerStarted","Data":"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.415518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerStarted","Data":"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.415628 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.418958 4790 generic.go:334] "Generic (PLEG): container finished" podID="d371e679-2539-4a57-9993-6bd66f0d311e" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" exitCode=0 Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.418984 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerDied","Data":"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.420633 4790 generic.go:334] "Generic (PLEG): container finished" podID="8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58" containerID="c7b64ce449f3b79cdcd1395e4a62437aae1930467d8a3439c9aa108a81cbf57c" exitCode=0 Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.420685 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerDied","Data":"c7b64ce449f3b79cdcd1395e4a62437aae1930467d8a3439c9aa108a81cbf57c"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.423678 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f5a24d7e-902f-4862-9c6b-8317f8fb3f29","Type":"ContainerStarted","Data":"dbf49fcbf9f4f102f77702250fea555eb9b6bde16734c9db16b78132dfde5910"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.425811 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerStarted","Data":"0e42fee0e10bca8be201fe50501a5e82454ac7b9cad70b1bd8bc28c89423c299"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.428328 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ba4867dc-70fb-4533-a075-31fc03f7ef33","Type":"ContainerStarted","Data":"6dd4cb3a76b88d73adca13ca1234e6df84737d567b9b78049d28b310fb4f5f23"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.430298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerStarted","Data":"a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.433511 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5" event={"ID":"c72ac557-7882-4120-b64a-4343639cc766","Type":"ContainerStarted","Data":"b56543c8608a9d7acf4c66ad8d1e279c901c3ebfb8dd790dfb8aad524883d947"} Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.433542 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-vspq5" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.471774 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=9.544165709 podStartE2EDuration="20.471755874s" podCreationTimestamp="2026-03-13 20:45:51 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.986122306 +0000 UTC m=+1090.007238197" lastFinishedPulling="2026-03-13 20:46:09.913712471 +0000 UTC m=+1100.934828362" observedRunningTime="2026-03-13 20:46:11.450727882 +0000 UTC m=+1102.471843773" watchObservedRunningTime="2026-03-13 20:46:11.471755874 +0000 UTC m=+1102.492871755" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.513102 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" podStartSLOduration=11.513081049 podStartE2EDuration="11.513081049s" podCreationTimestamp="2026-03-13 20:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:11.50911847 +0000 UTC m=+1102.530234371" watchObservedRunningTime="2026-03-13 20:46:11.513081049 +0000 UTC m=+1102.534196940" Mar 13 20:46:11 crc kubenswrapper[4790]: I0313 20:46:11.556779 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-vspq5" podStartSLOduration=6.753032632 podStartE2EDuration="17.556760316s" podCreationTimestamp="2026-03-13 20:45:54 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.999747907 +0000 UTC m=+1090.020863798" lastFinishedPulling="2026-03-13 20:46:09.803475591 +0000 UTC m=+1100.824591482" observedRunningTime="2026-03-13 20:46:11.554533866 +0000 UTC m=+1102.575649757" watchObservedRunningTime="2026-03-13 20:46:11.556760316 +0000 UTC m=+1102.577876207" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.454315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" event={"ID":"97e8561a-a685-44f0-986c-1559e5818ba8","Type":"ContainerDied","Data":"43aee0bab3af6a8bfdda4bb90672879a71cedb5993a623158c3730e25f5f67ba"} Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.454803 4790 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="43aee0bab3af6a8bfdda4bb90672879a71cedb5993a623158c3730e25f5f67ba" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.516330 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.589183 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") pod \"97e8561a-a685-44f0-986c-1559e5818ba8\" (UID: \"97e8561a-a685-44f0-986c-1559e5818ba8\") " Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.594674 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9" (OuterVolumeSpecName: "kube-api-access-mtpm9") pod "97e8561a-a685-44f0-986c-1559e5818ba8" (UID: "97e8561a-a685-44f0-986c-1559e5818ba8"). InnerVolumeSpecName "kube-api-access-mtpm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:13 crc kubenswrapper[4790]: I0313 20:46:13.692037 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtpm9\" (UniqueName: \"kubernetes.io/projected/97e8561a-a685-44f0-986c-1559e5818ba8-kube-api-access-mtpm9\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.465515 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerStarted","Data":"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.465934 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.468353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"ba4867dc-70fb-4533-a075-31fc03f7ef33","Type":"ContainerStarted","Data":"70a4726ebbb5eeec16130e19d4e9f480c13925eada3756ca2318b730eb7fce0e"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.469889 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-nrv7g" event={"ID":"dfb0e0ca-d164-4e22-9d3f-055a45a372d2","Type":"ContainerStarted","Data":"adc15f321ac61e0a2850691c665b578baab9c591f9b64c5be8f302eda5223247"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472134 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerStarted","Data":"0c13caebc66cb1cb2f2fc42dc4ef55e548b61bf3abdcdbc2c1d4701e7157becb"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472158 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-k7bzr" event={"ID":"8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58","Type":"ContainerStarted","Data":"54e2add76348c89f728c121ee9ca9e2012e5b1c658d6628cf3428d30665141c8"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472205 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.472232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.473971 4790 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f5a24d7e-902f-4862-9c6b-8317f8fb3f29","Type":"ContainerStarted","Data":"a44b1c232e61c6d1d71c2c09572fa8266651ab19b81937b993ea6c9db5e2c25a"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.475435 4790 generic.go:334] "Generic (PLEG): container finished" podID="fa2face0-9349-4482-880a-b23cf41099b2" containerID="0e42fee0e10bca8be201fe50501a5e82454ac7b9cad70b1bd8bc28c89423c299" exitCode=0 Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.475516 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerDied","Data":"0e42fee0e10bca8be201fe50501a5e82454ac7b9cad70b1bd8bc28c89423c299"} Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.475737 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557246-lrvrv" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.510543 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" podStartSLOduration=15.510524869 podStartE2EDuration="15.510524869s" podCreationTimestamp="2026-03-13 20:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:14.487715818 +0000 UTC m=+1105.508831719" watchObservedRunningTime="2026-03-13 20:46:14.510524869 +0000 UTC m=+1105.531640760" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.516232 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=6.870753563 podStartE2EDuration="17.516211113s" podCreationTimestamp="2026-03-13 20:45:57 +0000 UTC" firstStartedPulling="2026-03-13 20:46:02.751107935 +0000 UTC m=+1093.772223826" lastFinishedPulling="2026-03-13 20:46:13.396565485 +0000 UTC m=+1104.417681376" observedRunningTime="2026-03-13 20:46:14.50582138 +0000 UTC m=+1105.526937271" watchObservedRunningTime="2026-03-13 20:46:14.516211113 +0000 UTC m=+1105.537327004" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.537005 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=7.237340293 podStartE2EDuration="21.536984218s" podCreationTimestamp="2026-03-13 20:45:53 +0000 UTC" firstStartedPulling="2026-03-13 20:45:59.081210082 +0000 UTC m=+1090.102325973" lastFinishedPulling="2026-03-13 20:46:13.380854007 +0000 UTC m=+1104.401969898" observedRunningTime="2026-03-13 20:46:14.532516847 +0000 UTC m=+1105.553632758" watchObservedRunningTime="2026-03-13 20:46:14.536984218 +0000 UTC m=+1105.558100109" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.574039 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-k7bzr" podStartSLOduration=10.580376808 podStartE2EDuration="20.574018376s" podCreationTimestamp="2026-03-13 20:45:54 +0000 UTC" firstStartedPulling="2026-03-13 20:45:59.310799458 +0000 UTC m=+1090.331915349" lastFinishedPulling="2026-03-13 20:46:09.304441026 +0000 UTC m=+1100.325556917" observedRunningTime="2026-03-13 20:46:14.566634315 +0000 UTC m=+1105.587750206" watchObservedRunningTime="2026-03-13 20:46:14.574018376 +0000 UTC m=+1105.595134267" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.610071 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.616233 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-nrv7g" podStartSLOduration=11.425832602 podStartE2EDuration="15.616215603s" podCreationTimestamp="2026-03-13 20:45:59 +0000 UTC" firstStartedPulling="2026-03-13 20:46:09.190701243 +0000 UTC m=+1100.211817134" lastFinishedPulling="2026-03-13 20:46:13.381084244 +0000 UTC m=+1104.402200135" observedRunningTime="2026-03-13 20:46:14.583187385 +0000 UTC m=+1105.604303276" watchObservedRunningTime="2026-03-13 20:46:14.616215603 +0000 UTC m=+1105.637331494" Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.617451 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557240-8qw5d"] Mar 13 20:46:14 crc kubenswrapper[4790]: I0313 20:46:14.861244 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.485680 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"fa2face0-9349-4482-880a-b23cf41099b2","Type":"ContainerStarted","Data":"ecd707787ab07582c744789073f71c6681c500cf47b72baeab75726ed26695eb"} Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.488005 4790 generic.go:334] "Generic (PLEG): container finished" podID="fceb0829-5f0e-4e78-a803-61afc5aa4d60" containerID="86e0632ee7d85ec4a092fdd91f4cf4501da9716d6ba3776527053aa4e34f6f82" exitCode=0 Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.488146 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerDied","Data":"86e0632ee7d85ec4a092fdd91f4cf4501da9716d6ba3776527053aa4e34f6f82"} Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.514569 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=17.469506821 podStartE2EDuration="28.514541481s" podCreationTimestamp="2026-03-13 20:45:47 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.229924745 +0000 UTC m=+1089.251040636" lastFinishedPulling="2026-03-13 20:46:09.274959405 +0000 UTC m=+1100.296075296" observedRunningTime="2026-03-13 20:46:15.505419983 +0000 UTC m=+1106.526535884" watchObservedRunningTime="2026-03-13 20:46:15.514541481 +0000 UTC m=+1106.535657382" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.670970 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f1fa3a-7f88-4e89-bd00-4426798fccce" path="/var/lib/kubelet/pods/f6f1fa3a-7f88-4e89-bd00-4426798fccce/volumes" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.675539 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.739824 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.861695 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:15 crc kubenswrapper[4790]: I0313 20:46:15.906855 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.498658 4790 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"fceb0829-5f0e-4e78-a803-61afc5aa4d60","Type":"ContainerStarted","Data":"7367436a9fe0a765fc7d324d2f702d2ba65dbe1ea3313bfa3b02a185aee63c92"} Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.498999 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" containerID="cri-o://b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" gracePeriod=10 Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.531586 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=21.411194354 podStartE2EDuration="30.531566487s" podCreationTimestamp="2026-03-13 20:45:46 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.875147186 +0000 UTC m=+1089.896263077" lastFinishedPulling="2026-03-13 20:46:07.995519309 +0000 UTC m=+1099.016635210" observedRunningTime="2026-03-13 20:46:16.522426628 +0000 UTC m=+1107.543542559" watchObservedRunningTime="2026-03-13 20:46:16.531566487 +0000 UTC m=+1107.552682388" Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.547967 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 13 20:46:16 crc kubenswrapper[4790]: I0313 20:46:16.960794 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059593 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059763 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.059848 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") pod \"d371e679-2539-4a57-9993-6bd66f0d311e\" (UID: \"d371e679-2539-4a57-9993-6bd66f0d311e\") " Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.065836 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4" (OuterVolumeSpecName: "kube-api-access-f68w4") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "kube-api-access-f68w4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.100392 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config" (OuterVolumeSpecName: "config") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.100411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.101876 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d371e679-2539-4a57-9993-6bd66f0d311e" (UID: "d371e679-2539-4a57-9993-6bd66f0d311e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161043 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161071 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161083 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f68w4\" (UniqueName: \"kubernetes.io/projected/d371e679-2539-4a57-9993-6bd66f0d311e-kube-api-access-f68w4\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.161092 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d371e679-2539-4a57-9993-6bd66f0d311e-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.212675 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.258839 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510260 4790 generic.go:334] "Generic (PLEG): container finished" podID="d371e679-2539-4a57-9993-6bd66f0d311e" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" exitCode=0 Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerDied","Data":"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb"} Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510367 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510417 4790 scope.go:117] "RemoveContainer" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fd796d7df-wrjx8" event={"ID":"d371e679-2539-4a57-9993-6bd66f0d311e","Type":"ContainerDied","Data":"e1ec4e6dbb07edb7337832aef993c9775ea9bc7102522ba760952f131af54261"} Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.510754 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.534436 4790 scope.go:117] "RemoveContainer" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.544255 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.550633 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fd796d7df-wrjx8"] Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.551214 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.577736 4790 scope.go:117] "RemoveContainer" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.579778 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb\": container with ID starting with b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb not found: ID does not exist" containerID="b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.579822 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb"} err="failed to get container status \"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb\": rpc error: code = NotFound desc = could not find container \"b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb\": container with ID starting with b1ccd5dc1b5fa2962a791f0f9bd1dc99dc41dc232bde69af0c11efe8fd4b0edb not found: ID does not exist" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.579849 4790 scope.go:117] "RemoveContainer" containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.579961 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.581353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.581510 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03\": container with ID starting with 35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03 not found: ID does not exist" 
containerID="35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.581549 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03"} err="failed to get container status \"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03\": rpc error: code = NotFound desc = could not find container \"35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03\": container with ID starting with 35c31d96d7cafc55ea2241f29c1688f2e4d149d458481295371f887866ab8d03 not found: ID does not exist" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.671820 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" path="/var/lib/kubelet/pods/d371e679-2539-4a57-9993-6bd66f0d311e/volumes" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.702989 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703314 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703340 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703346 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703362 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703369 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.703408 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" containerName="oc" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.703419 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" containerName="oc" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.705700 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.705731 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="init" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.705766 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.705774 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: E0313 20:46:17.705798 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" Mar 13 
20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.705806 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706072 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" containerName="oc" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706084 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3603867e-b715-48af-b4d3-248f69035bf4" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706094 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b63dd900-9f63-4b6a-b620-bd1dfaa88cfe" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706111 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d371e679-2539-4a57-9993-6bd66f0d311e" containerName="dnsmasq-dns" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.706911 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.718910 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.719238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-96kb4" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.719879 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.720028 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.721341 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4knsx\" (UniqueName: \"kubernetes.io/projected/18e18c94-0ce6-4578-a224-384826512a34-kube-api-access-4knsx\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872390 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-config\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18e18c94-0ce6-4578-a224-384826512a34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872750 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.872811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-scripts\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.974937 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18e18c94-0ce6-4578-a224-384826512a34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975011 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975031 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-scripts\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4knsx\" (UniqueName: \"kubernetes.io/projected/18e18c94-0ce6-4578-a224-384826512a34-kube-api-access-4knsx\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-config\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " 
pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.975528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/18e18c94-0ce6-4578-a224-384826512a34-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.976259 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-scripts\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.976264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18e18c94-0ce6-4578-a224-384826512a34-config\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.979643 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.979874 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.986087 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/18e18c94-0ce6-4578-a224-384826512a34-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:17 crc kubenswrapper[4790]: I0313 20:46:17.993008 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4knsx\" (UniqueName: \"kubernetes.io/projected/18e18c94-0ce6-4578-a224-384826512a34-kube-api-access-4knsx\") pod \"ovn-northd-0\" (UID: \"18e18c94-0ce6-4578-a224-384826512a34\") " pod="openstack/ovn-northd-0" Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.029436 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 13 20:46:18 crc kubenswrapper[4790]: W0313 20:46:18.450886 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod18e18c94_0ce6_4578_a224_384826512a34.slice/crio-c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719 WatchSource:0}: Error finding container c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719: Status 404 returned error can't find the container with id c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719 Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.456796 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.519526 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18e18c94-0ce6-4578-a224-384826512a34","Type":"ContainerStarted","Data":"c97fd7a25afd369d5bd6017efa0660e834e67bb594f27cde893e62a7317b2719"} Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.939039 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:18 crc kubenswrapper[4790]: I0313 20:46:18.939112 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:19 crc kubenswrapper[4790]: I0313 20:46:19.248223 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 13 20:46:19 crc kubenswrapper[4790]: E0313 20:46:19.501869 4790 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.143:36798->38.102.83.143:39163: write tcp 38.102.83.143:36798->38.102.83.143:39163: write: broken pipe Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.537109 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18e18c94-0ce6-4578-a224-384826512a34","Type":"ContainerStarted","Data":"0bdeecb17f1f462f15f7514a6de1d42f2a2888bfdb2c48b2e4f3fa9a499f4076"} Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.537449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"18e18c94-0ce6-4578-a224-384826512a34","Type":"ContainerStarted","Data":"195304f3dcc55f6eff0508aa06cf89a02ba45b95a698d078d2a77be9a2a267ba"} Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.537465 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 13 20:46:20 crc kubenswrapper[4790]: I0313 20:46:20.561627 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.509577958 podStartE2EDuration="3.561590037s" podCreationTimestamp="2026-03-13 20:46:17 +0000 UTC" firstStartedPulling="2026-03-13 20:46:18.453133759 +0000 UTC m=+1109.474249650" lastFinishedPulling="2026-03-13 20:46:19.505145838 +0000 UTC m=+1110.526261729" observedRunningTime="2026-03-13 20:46:20.55617708 +0000 UTC m=+1111.577292971" watchObservedRunningTime="2026-03-13 20:46:20.561590037 +0000 UTC m=+1111.582705978" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.352829 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.354228 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.365367 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.415648 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433541 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433594 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.433948 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.531843 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.534929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.534981 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.535043 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmp6q\" 
(UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.535069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.535117 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.536268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.536575 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.538026 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.539285 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.554924 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"dnsmasq-dns-698758b865-gv56q\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.616491 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 13 20:46:21 crc kubenswrapper[4790]: I0313 20:46:21.676929 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:22 crc kubenswrapper[4790]: W0313 20:46:22.117138 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd798b6d8_8c2b_4827_81d3_09177054591f.slice/crio-c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3 WatchSource:0}: Error finding container c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3: Status 404 returned error can't find the container with id c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3 Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.126919 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.483844 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.490086 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.492084 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.492574 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-ngh5j" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.492626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.493356 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.504206 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.551203 4790 generic.go:334] "Generic (PLEG): container finished" podID="d798b6d8-8c2b-4827-81d3-09177054591f" containerID="978e68813566a9c04dd155a064a373e2649857a2eefbe05ca9b8949d3e9db280" exitCode=0 Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.551244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerDied","Data":"978e68813566a9c04dd155a064a373e2649857a2eefbe05ca9b8949d3e9db280"} Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.551268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerStarted","Data":"c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3"} Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-lock\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552166 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc 
kubenswrapper[4790]: I0313 20:46:22.552247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529b41ec-f1ee-432c-ac41-6957e1809aaa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-cache\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cr9q\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-kube-api-access-6cr9q\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.552450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.653585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529b41ec-f1ee-432c-ac41-6957e1809aaa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.653841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-cache\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.653927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cr9q\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-kube-api-access-6cr9q\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.654061 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.654174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-lock\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.654272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: 
\"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: E0313 20:46:22.654467 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:22 crc kubenswrapper[4790]: E0313 20:46:22.654533 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:22 crc kubenswrapper[4790]: E0313 20:46:22.654642 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:23.154622474 +0000 UTC m=+1114.175738365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.656006 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.657312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-cache\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.657414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/529b41ec-f1ee-432c-ac41-6957e1809aaa-lock\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.660011 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/529b41ec-f1ee-432c-ac41-6957e1809aaa-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.676206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cr9q\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-kube-api-access-6cr9q\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:22 crc kubenswrapper[4790]: I0313 20:46:22.687652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.099414 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-dv686"] Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.100768 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.102409 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.102773 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.105920 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.113629 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dv686"] Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164092 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164145 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164358 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164573 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 
20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.164661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:23 crc kubenswrapper[4790]: E0313 20:46:23.164789 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:23 crc kubenswrapper[4790]: E0313 20:46:23.164815 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:23 crc kubenswrapper[4790]: E0313 20:46:23.164885 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:24.164861564 +0000 UTC m=+1115.185977455 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.265849 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.266042 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.266142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.267877 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268209 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268266 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod 
\"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.268562 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.269100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.269222 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.269901 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.270972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.271030 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.286538 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"swift-ring-rebalance-dv686\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.416681 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.575933 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerStarted","Data":"e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb"} Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.576319 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.604462 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-gv56q" podStartSLOduration=2.604441642 podStartE2EDuration="2.604441642s" podCreationTimestamp="2026-03-13 20:46:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:23.604037391 +0000 UTC m=+1114.625153292" watchObservedRunningTime="2026-03-13 20:46:23.604441642 +0000 UTC m=+1114.625557533" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.699104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.762398 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 13 20:46:23 crc kubenswrapper[4790]: I0313 20:46:23.877278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-dv686"] Mar 13 20:46:24 crc kubenswrapper[4790]: I0313 20:46:24.181942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:24 crc kubenswrapper[4790]: E0313 20:46:24.182103 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:24 crc kubenswrapper[4790]: E0313 20:46:24.182124 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:24 crc kubenswrapper[4790]: E0313 20:46:24.182185 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:26.182165528 +0000 UTC m=+1117.203281419 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:24 crc kubenswrapper[4790]: I0313 20:46:24.585172 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerStarted","Data":"e3681864143fdf49c4108aa2fae3bb58046cc42f144960f544964a65dc7f5591"} Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.217405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:26 crc kubenswrapper[4790]: E0313 20:46:26.217825 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:26 crc kubenswrapper[4790]: E0313 20:46:26.217924 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:26 crc kubenswrapper[4790]: E0313 20:46:26.218002 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:30.217983819 +0000 UTC m=+1121.239099710 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.371768 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.372706 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.380426 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.387934 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.521498 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.521825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.623012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.623138 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.623917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.642463 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"root-account-create-update-n4fjc\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:26 crc kubenswrapper[4790]: I0313 20:46:26.705123 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:27 crc kubenswrapper[4790]: I0313 20:46:27.977969 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:27 crc kubenswrapper[4790]: W0313 20:46:27.983653 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d3dd8de_0de0_4703_a067_446d2822860d.slice/crio-489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c WatchSource:0}: Error finding container 489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c: Status 404 returned error can't find the container with id 489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.615673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerStarted","Data":"0b6241fd3bfe8fbe3b943719b842facbdec444bbd9bc9d23531d0137fa8a476f"} Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.620142 4790 generic.go:334] "Generic (PLEG): container finished" podID="5d3dd8de-0de0-4703-a067-446d2822860d" containerID="a469cae8d28a17763807dc70d5fbc5f435ef49995e55c306927cfc053eea835d" exitCode=0 Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.620185 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4fjc" event={"ID":"5d3dd8de-0de0-4703-a067-446d2822860d","Type":"ContainerDied","Data":"a469cae8d28a17763807dc70d5fbc5f435ef49995e55c306927cfc053eea835d"} Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.620208 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4fjc" event={"ID":"5d3dd8de-0de0-4703-a067-446d2822860d","Type":"ContainerStarted","Data":"489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c"} Mar 13 20:46:28 crc kubenswrapper[4790]: I0313 20:46:28.641943 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-dv686" podStartSLOduration=1.770172504 podStartE2EDuration="5.641922588s" podCreationTimestamp="2026-03-13 20:46:23 +0000 UTC" firstStartedPulling="2026-03-13 20:46:23.882605899 +0000 UTC m=+1114.903721790" lastFinishedPulling="2026-03-13 20:46:27.754355983 +0000 UTC m=+1118.775471874" observedRunningTime="2026-03-13 20:46:28.63793745 +0000 UTC m=+1119.659053351" watchObservedRunningTime="2026-03-13 20:46:28.641922588 +0000 UTC m=+1119.663038479" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.062797 4790 scope.go:117] "RemoveContainer" containerID="a4421190e0f8f7d5d0550c9770d73abc8a710d933f4a6e67738054d90201114f" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.380932 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.382815 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.390742 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.475722 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.477242 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.479330 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.484657 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.576811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.577329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.678829 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.679505 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.679724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.679984 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.680846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.700353 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"glance-db-create-qflsz\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.712082 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.783224 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.783365 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.784102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.848494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"glance-bc9a-account-create-update-7s4hb\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:29 crc kubenswrapper[4790]: I0313 20:46:29.996938 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.090773 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.193143 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") pod \"5d3dd8de-0de0-4703-a067-446d2822860d\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.193170 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") pod \"5d3dd8de-0de0-4703-a067-446d2822860d\" (UID: \"5d3dd8de-0de0-4703-a067-446d2822860d\") " Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.194292 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5d3dd8de-0de0-4703-a067-446d2822860d" (UID: "5d3dd8de-0de0-4703-a067-446d2822860d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.200706 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4" (OuterVolumeSpecName: "kube-api-access-8d7l4") pod "5d3dd8de-0de0-4703-a067-446d2822860d" (UID: "5d3dd8de-0de0-4703-a067-446d2822860d"). InnerVolumeSpecName "kube-api-access-8d7l4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.201837 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:46:30 crc kubenswrapper[4790]: W0313 20:46:30.208863 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bfc00cf_9a76_4b6f_a8f5_315af824814d.slice/crio-ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03 WatchSource:0}: Error finding container ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03: Status 404 returned error can't find the container with id ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03 Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.278993 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.279655 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" containerName="mariadb-account-create-update" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.279695 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" containerName="mariadb-account-create-update" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.279946 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" containerName="mariadb-account-create-update" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.280749 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.294978 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295033 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295210 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295293 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5d3dd8de-0de0-4703-a067-446d2822860d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.295310 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8d7l4\" (UniqueName: \"kubernetes.io/projected/5d3dd8de-0de0-4703-a067-446d2822860d-kube-api-access-8d7l4\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.295370 4790 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.295416 4790 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 13 20:46:30 crc kubenswrapper[4790]: E0313 20:46:30.295484 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift podName:529b41ec-f1ee-432c-ac41-6957e1809aaa nodeName:}" failed. No retries permitted until 2026-03-13 20:46:38.295449118 +0000 UTC m=+1129.316565009 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift") pod "swift-storage-0" (UID: "529b41ec-f1ee-432c-ac41-6957e1809aaa") : configmap "swift-ring-files" not found Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.298037 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.392394 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.395058 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396412 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396475 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.396579 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.397102 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.397626 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.400732 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.421132 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"keystone-db-create-dtps4\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.494278 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.498214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.498263 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.498935 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.514400 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"keystone-6245-account-create-update-5tjxd\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.579832 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.581305 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.599909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.599975 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.608460 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.609431 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.613806 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.616479 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.626899 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.642653 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerStarted","Data":"3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.642770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerStarted","Data":"ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.649034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-n4fjc" event={"ID":"5d3dd8de-0de0-4703-a067-446d2822860d","Type":"ContainerDied","Data":"489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.649069 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="489d3c25ae47d4be64662e82f5e8ce80011fc5028c50363476e2c336894ee85c" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.649044 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-n4fjc" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.652186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc9a-account-create-update-7s4hb" event={"ID":"1b5f7e2a-401c-4a9f-9222-5037f9d1d499","Type":"ContainerStarted","Data":"8d620e2ac11015f6738329101767fa1a633c874417fba59eff69e65d3e55a8a1"} Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.678448 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-create-qflsz" podStartSLOduration=1.678433337 podStartE2EDuration="1.678433337s" podCreationTimestamp="2026-03-13 20:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:30.661972119 +0000 UTC m=+1121.683088010" watchObservedRunningTime="2026-03-13 20:46:30.678433337 +0000 UTC m=+1121.699549228" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.681721 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-bc9a-account-create-update-7s4hb" podStartSLOduration=1.681713646 podStartE2EDuration="1.681713646s" podCreationTimestamp="2026-03-13 20:46:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:30.675996421 +0000 UTC m=+1121.697112302" watchObservedRunningTime="2026-03-13 20:46:30.681713646 +0000 UTC m=+1121.702829537" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.684882 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.702692 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.703936 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.723119 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"placement-db-create-swgpr\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.732756 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.805129 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.805499 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.806224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.828493 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"placement-76eb-account-create-update-fsrb9\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.907022 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:30 crc kubenswrapper[4790]: I0313 20:46:30.929491 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.115343 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.124704 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20a3a1cb_c500_4355_ae67_649e381b1b88.slice/crio-bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d WatchSource:0}: Error finding container bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d: Status 404 returned error can't find the container with id bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.240151 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.247405 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a7e4224_0922_4f9a_af94_0a9933f27530.slice/crio-b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c WatchSource:0}: Error finding container b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c: Status 404 returned error can't find the container with id b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.372730 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.373708 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0a6f76f_d9d1_4ab9_ac4c_e483e55926a0.slice/crio-9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411 WatchSource:0}: Error finding container 9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411: Status 404 returned error can't find the container with id 9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.455424 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:46:31 crc kubenswrapper[4790]: W0313 20:46:31.487317 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b6f7fe9_fb1f_430c_80e5_0dbe98da2b9c.slice/crio-8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1 WatchSource:0}: Error finding container 8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1: Status 404 returned error can't find the container with id 8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.666070 4790 generic.go:334] "Generic (PLEG): container finished" podID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerID="3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.669167 4790 generic.go:334] "Generic (PLEG): container finished" podID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerID="eaedee9332ceb5ac2c43fa820fcea3e6086d5dfda3317381786c3cc819576b44" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.671325 4790 generic.go:334] "Generic (PLEG): container 
finished" podID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerID="e50d6c82675c18c36b9041dc6a13dffb21bb7a9c1cb73ee61c06ce0d61f0b9b3" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673163 4790 generic.go:334] "Generic (PLEG): container finished" podID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerID="ff9f56b80e2e388086557f7fc707002adf2609bc96cff97367abf262894bf61f" exitCode=0 Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673434 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerStarted","Data":"2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerStarted","Data":"8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerDied","Data":"3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673902 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dtps4" event={"ID":"20a3a1cb-c500-4355-ae67-649e381b1b88","Type":"ContainerDied","Data":"eaedee9332ceb5ac2c43fa820fcea3e6086d5dfda3317381786c3cc819576b44"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dtps4" event={"ID":"20a3a1cb-c500-4355-ae67-649e381b1b88","Type":"ContainerStarted","Data":"bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6245-account-create-update-5tjxd" event={"ID":"2a7e4224-0922-4f9a-af94-0a9933f27530","Type":"ContainerDied","Data":"e50d6c82675c18c36b9041dc6a13dffb21bb7a9c1cb73ee61c06ce0d61f0b9b3"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673946 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6245-account-create-update-5tjxd" event={"ID":"2a7e4224-0922-4f9a-af94-0a9933f27530","Type":"ContainerStarted","Data":"b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.673958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc9a-account-create-update-7s4hb" event={"ID":"1b5f7e2a-401c-4a9f-9222-5037f9d1d499","Type":"ContainerDied","Data":"ff9f56b80e2e388086557f7fc707002adf2609bc96cff97367abf262894bf61f"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.674796 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerStarted","Data":"5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.674828 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" 
event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerStarted","Data":"9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411"} Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.686540 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.701644 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-76eb-account-create-update-fsrb9" podStartSLOduration=1.701619511 podStartE2EDuration="1.701619511s" podCreationTimestamp="2026-03-13 20:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:31.68430346 +0000 UTC m=+1122.705419351" watchObservedRunningTime="2026-03-13 20:46:31.701619511 +0000 UTC m=+1122.722735412" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.708560 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-swgpr" podStartSLOduration=1.708537519 podStartE2EDuration="1.708537519s" podCreationTimestamp="2026-03-13 20:46:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:31.69570904 +0000 UTC m=+1122.716824941" watchObservedRunningTime="2026-03-13 20:46:31.708537519 +0000 UTC m=+1122.729653430" Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.823677 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:31 crc kubenswrapper[4790]: I0313 20:46:31.824326 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" containerID="cri-o://e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" gracePeriod=10 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.369322 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437302 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437362 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437400 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.437505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") pod \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\" (UID: \"5980214a-6a36-4a9b-bb65-1ca2b979d0cc\") " Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.456561 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k" (OuterVolumeSpecName: "kube-api-access-hcv5k") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "kube-api-access-hcv5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.490343 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.505415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.511828 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config" (OuterVolumeSpecName: "config") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.525493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5980214a-6a36-4a9b-bb65-1ca2b979d0cc" (UID: "5980214a-6a36-4a9b-bb65-1ca2b979d0cc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539513 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539551 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539567 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539577 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.539589 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcv5k\" (UniqueName: \"kubernetes.io/projected/5980214a-6a36-4a9b-bb65-1ca2b979d0cc-kube-api-access-hcv5k\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.573372 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.580190 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-n4fjc"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.684735 4790 generic.go:334] "Generic (PLEG): container finished" podID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" exitCode=0 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.684833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerDied","Data":"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.685050 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" event={"ID":"5980214a-6a36-4a9b-bb65-1ca2b979d0cc","Type":"ContainerDied","Data":"6e492e6d818823347461bc8ffdd10cbcadc5631db367ed321826eafcc8fcf49a"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.684850 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-7w2fv" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.685072 4790 scope.go:117] "RemoveContainer" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.686727 4790 generic.go:334] "Generic (PLEG): container finished" podID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerID="5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498" exitCode=0 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.686817 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerDied","Data":"5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.689218 4790 generic.go:334] "Generic (PLEG): container finished" podID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerID="2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368" exitCode=0 Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.689479 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerDied","Data":"2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368"} Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.715141 4790 scope.go:117] "RemoveContainer" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.745288 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.750457 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-7w2fv"] Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.785323 4790 scope.go:117] "RemoveContainer" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" Mar 13 20:46:32 crc kubenswrapper[4790]: E0313 20:46:32.790155 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c\": container with ID starting with e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c not found: ID does not exist" containerID="e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.790193 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c"} err="failed to get container status \"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c\": rpc error: code = NotFound desc = could not find container \"e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c\": container with ID starting with e1de77625769e7f07c9aa81ebf54b2b90e05a4acaf37af2a5d100e9b4cd0aa4c not found: ID does not exist" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.790215 4790 scope.go:117] "RemoveContainer" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" Mar 13 20:46:32 crc kubenswrapper[4790]: E0313 20:46:32.792427 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6\": container with ID starting with d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6 not found: ID does not exist" containerID="d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6" Mar 13 20:46:32 crc kubenswrapper[4790]: I0313 20:46:32.792475 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6"} err="failed to get container status \"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6\": rpc error: code = NotFound desc = could not find container \"d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6\": container with ID starting with d7c869a2f8b8b93e2f3889d9c52b9758cafc80caf9c5547281b3ea804ea29dd6 not found: ID does not exist" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.023412 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.157844 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") pod \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.157937 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") pod \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\" (UID: \"1b5f7e2a-401c-4a9f-9222-5037f9d1d499\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.159058 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1b5f7e2a-401c-4a9f-9222-5037f9d1d499" (UID: "1b5f7e2a-401c-4a9f-9222-5037f9d1d499"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.167625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh" (OuterVolumeSpecName: "kube-api-access-t9zjh") pod "1b5f7e2a-401c-4a9f-9222-5037f9d1d499" (UID: "1b5f7e2a-401c-4a9f-9222-5037f9d1d499"). InnerVolumeSpecName "kube-api-access-t9zjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.237273 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.244142 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.260702 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.260741 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t9zjh\" (UniqueName: \"kubernetes.io/projected/1b5f7e2a-401c-4a9f-9222-5037f9d1d499-kube-api-access-t9zjh\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.287305 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361238 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") pod \"2a7e4224-0922-4f9a-af94-0a9933f27530\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361431 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") pod \"20a3a1cb-c500-4355-ae67-649e381b1b88\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361461 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") pod \"2a7e4224-0922-4f9a-af94-0a9933f27530\" (UID: \"2a7e4224-0922-4f9a-af94-0a9933f27530\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.361513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") pod \"20a3a1cb-c500-4355-ae67-649e381b1b88\" (UID: \"20a3a1cb-c500-4355-ae67-649e381b1b88\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.362142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2a7e4224-0922-4f9a-af94-0a9933f27530" (UID: "2a7e4224-0922-4f9a-af94-0a9933f27530"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.362197 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "20a3a1cb-c500-4355-ae67-649e381b1b88" (UID: "20a3a1cb-c500-4355-ae67-649e381b1b88"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.364639 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw" (OuterVolumeSpecName: "kube-api-access-rrddw") pod "20a3a1cb-c500-4355-ae67-649e381b1b88" (UID: "20a3a1cb-c500-4355-ae67-649e381b1b88"). 
InnerVolumeSpecName "kube-api-access-rrddw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.364770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq" (OuterVolumeSpecName: "kube-api-access-sfgwq") pod "2a7e4224-0922-4f9a-af94-0a9933f27530" (UID: "2a7e4224-0922-4f9a-af94-0a9933f27530"). InnerVolumeSpecName "kube-api-access-sfgwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.462332 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") pod \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.462620 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") pod \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\" (UID: \"9bfc00cf-9a76-4b6f-a8f5-315af824814d\") " Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.462845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bfc00cf-9a76-4b6f-a8f5-315af824814d" (UID: "9bfc00cf-9a76-4b6f-a8f5-315af824814d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463025 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bfc00cf-9a76-4b6f-a8f5-315af824814d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463042 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rrddw\" (UniqueName: \"kubernetes.io/projected/20a3a1cb-c500-4355-ae67-649e381b1b88-kube-api-access-rrddw\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463055 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2a7e4224-0922-4f9a-af94-0a9933f27530-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463068 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20a3a1cb-c500-4355-ae67-649e381b1b88-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.463080 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfgwq\" (UniqueName: \"kubernetes.io/projected/2a7e4224-0922-4f9a-af94-0a9933f27530-kube-api-access-sfgwq\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.465534 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn" (OuterVolumeSpecName: "kube-api-access-k76nn") pod "9bfc00cf-9a76-4b6f-a8f5-315af824814d" (UID: "9bfc00cf-9a76-4b6f-a8f5-315af824814d"). InnerVolumeSpecName "kube-api-access-k76nn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.564624 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k76nn\" (UniqueName: \"kubernetes.io/projected/9bfc00cf-9a76-4b6f-a8f5-315af824814d-kube-api-access-k76nn\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.671955 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" path="/var/lib/kubelet/pods/5980214a-6a36-4a9b-bb65-1ca2b979d0cc/volumes" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.673726 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d3dd8de-0de0-4703-a067-446d2822860d" path="/var/lib/kubelet/pods/5d3dd8de-0de0-4703-a067-446d2822860d/volumes" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.710746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-bc9a-account-create-update-7s4hb" event={"ID":"1b5f7e2a-401c-4a9f-9222-5037f9d1d499","Type":"ContainerDied","Data":"8d620e2ac11015f6738329101767fa1a633c874417fba59eff69e65d3e55a8a1"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.710787 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d620e2ac11015f6738329101767fa1a633c874417fba59eff69e65d3e55a8a1" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.710845 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-bc9a-account-create-update-7s4hb" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.714082 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-qflsz" event={"ID":"9bfc00cf-9a76-4b6f-a8f5-315af824814d","Type":"ContainerDied","Data":"ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.714123 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddf21cc08c3a5bbf2369906562b5c9d06661c39fc0ab46bfc76292c3cc9a4b03" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.714171 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-qflsz" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.716715 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-dtps4" event={"ID":"20a3a1cb-c500-4355-ae67-649e381b1b88","Type":"ContainerDied","Data":"bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.716735 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-dtps4" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.716743 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb5bd2129dbed8046248b1ede3e1cd1faabe029807cfd8a506d54cd72633b98d" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.720688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-6245-account-create-update-5tjxd" event={"ID":"2a7e4224-0922-4f9a-af94-0a9933f27530","Type":"ContainerDied","Data":"b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c"} Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.720723 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a6527df1a629bb9f718a9d6c1b688f92dbe90739009ea26c240cdbeb3c3d7c" Mar 13 20:46:33 crc kubenswrapper[4790]: I0313 20:46:33.720818 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-6245-account-create-update-5tjxd" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.128457 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.134045 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277057 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") pod \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277551 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") pod \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277609 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") pod \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\" (UID: \"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.277714 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") pod \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\" (UID: \"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0\") " Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.278167 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" (UID: "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.278512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" (UID: "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.284524 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp" (OuterVolumeSpecName: "kube-api-access-jtlnp") pod "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" (UID: "0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c"). InnerVolumeSpecName "kube-api-access-jtlnp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.287076 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr" (OuterVolumeSpecName: "kube-api-access-9ddwr") pod "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" (UID: "a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0"). InnerVolumeSpecName "kube-api-access-9ddwr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379498 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ddwr\" (UniqueName: \"kubernetes.io/projected/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-kube-api-access-9ddwr\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379542 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379554 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtlnp\" (UniqueName: \"kubernetes.io/projected/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c-kube-api-access-jtlnp\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.379567 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607109 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607477 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607497 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607507 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607515 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 
20:46:34.607525 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607534 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607549 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607558 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607571 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607578 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607593 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="init" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607600 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="init" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607610 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607616 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: E0313 20:46:34.607628 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607634 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607792 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5980214a-6a36-4a9b-bb65-1ca2b979d0cc" containerName="dnsmasq-dns" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607805 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607818 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607828 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607838 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607849 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" containerName="mariadb-database-create" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.607860 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" containerName="mariadb-account-create-update" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.608337 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.610339 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.610787 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dwzcz" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.621684 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.728671 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-swgpr" event={"ID":"a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0","Type":"ContainerDied","Data":"9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411"} Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.728708 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a77cfdb7cfcd9f2075b2874e9f60e5294a75e734bb2cfb24f41c6b1f0d6d411" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.729069 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-swgpr" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.730145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-76eb-account-create-update-fsrb9" event={"ID":"0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c","Type":"ContainerDied","Data":"8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1"} Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.730179 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c1082d257c00a44403ddd4a85f3ca8dab669ff140745988c7896ffc01d4e1d1" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.730204 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-76eb-account-create-update-fsrb9" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785198 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.785833 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887368 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887420 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.887476 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.892234 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 
13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.892640 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.896267 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.907009 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"glance-db-sync-pshzp\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:34 crc kubenswrapper[4790]: I0313 20:46:34.984885 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:46:35 crc kubenswrapper[4790]: W0313 20:46:35.510826 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda93720f0_c882_49d8_bd56_7d77237da6e7.slice/crio-e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8 WatchSource:0}: Error finding container e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8: Status 404 returned error can't find the container with id e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8 Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.510926 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.754027 4790 generic.go:334] "Generic (PLEG): container finished" podID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerID="0b6241fd3bfe8fbe3b943719b842facbdec444bbd9bc9d23531d0137fa8a476f" exitCode=0 Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.754124 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerDied","Data":"0b6241fd3bfe8fbe3b943719b842facbdec444bbd9bc9d23531d0137fa8a476f"} Mar 13 20:46:35 crc kubenswrapper[4790]: I0313 20:46:35.757806 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerStarted","Data":"e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8"} Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.101823 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225104 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225172 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225741 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.225797 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226163 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226226 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.226346 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") pod \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\" (UID: \"b4ea3695-dddc-48fe-bdb6-eb0450c697c4\") " Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.227356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.227601 4790 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.227774 4790 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.230799 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq" (OuterVolumeSpecName: "kube-api-access-chgfq") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "kube-api-access-chgfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.245498 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts" (OuterVolumeSpecName: "scripts") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.246090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.250361 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.252459 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b4ea3695-dddc-48fe-bdb6-eb0450c697c4" (UID: "b4ea3695-dddc-48fe-bdb6-eb0450c697c4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329790 4790 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329828 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chgfq\" (UniqueName: \"kubernetes.io/projected/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-kube-api-access-chgfq\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329842 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329854 4790 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.329865 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b4ea3695-dddc-48fe-bdb6-eb0450c697c4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.586132 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:46:37 crc kubenswrapper[4790]: E0313 20:46:37.586737 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerName="swift-ring-rebalance" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.586781 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerName="swift-ring-rebalance" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.587027 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4ea3695-dddc-48fe-bdb6-eb0450c697c4" containerName="swift-ring-rebalance" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.587607 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.590996 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.593197 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.736017 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.736402 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.774913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-dv686" event={"ID":"b4ea3695-dddc-48fe-bdb6-eb0450c697c4","Type":"ContainerDied","Data":"e3681864143fdf49c4108aa2fae3bb58046cc42f144960f544964a65dc7f5591"} Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.774982 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-dv686" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.774988 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e3681864143fdf49c4108aa2fae3bb58046cc42f144960f544964a65dc7f5591" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.837591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.837719 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.838477 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.855051 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"root-account-create-update-fjjbp\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " 
pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:37 crc kubenswrapper[4790]: I0313 20:46:37.909168 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.114611 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.345560 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.352249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/529b41ec-f1ee-432c-ac41-6957e1809aaa-etc-swift\") pod \"swift-storage-0\" (UID: \"529b41ec-f1ee-432c-ac41-6957e1809aaa\") " pod="openstack/swift-storage-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.381514 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:46:38 crc kubenswrapper[4790]: W0313 20:46:38.384574 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfa975ed_d42b_43be_91a1_4a2288005883.slice/crio-3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099 WatchSource:0}: Error finding container 3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099: Status 404 returned error can't find the container with id 3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099 Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.405462 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.784157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerStarted","Data":"3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099"} Mar 13 20:46:38 crc kubenswrapper[4790]: I0313 20:46:38.957530 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 13 20:46:38 crc kubenswrapper[4790]: W0313 20:46:38.965113 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod529b41ec_f1ee_432c_ac41_6957e1809aaa.slice/crio-9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021 WatchSource:0}: Error finding container 9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021: Status 404 returned error can't find the container with id 9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021 Mar 13 20:46:39 crc kubenswrapper[4790]: I0313 20:46:39.793241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"9415bf0f3992a52cf1a79ff736ac04bd686941b9e09ae8cebf4afd6163b8f021"} Mar 13 20:46:40 crc kubenswrapper[4790]: I0313 20:46:40.803966 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerStarted","Data":"caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba"} Mar 13 20:46:40 crc kubenswrapper[4790]: I0313 20:46:40.822808 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-fjjbp" podStartSLOduration=3.822785085 podStartE2EDuration="3.822785085s" podCreationTimestamp="2026-03-13 20:46:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:46:40.817282715 +0000 UTC m=+1131.838398606" watchObservedRunningTime="2026-03-13 20:46:40.822785085 +0000 UTC m=+1131.843900976" Mar 13 20:46:41 crc kubenswrapper[4790]: I0313 20:46:41.814344 4790 generic.go:334] "Generic (PLEG): container finished" podID="cfa975ed-d42b-43be-91a1-4a2288005883" containerID="caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba" exitCode=0 Mar 13 20:46:41 crc kubenswrapper[4790]: I0313 20:46:41.814415 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerDied","Data":"caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba"} Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.831258 4790 generic.go:334] "Generic (PLEG): container finished" podID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" exitCode=0 Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.831373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerDied","Data":"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde"} Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.834236 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerID="a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040" exitCode=0 Mar 13 20:46:43 crc kubenswrapper[4790]: I0313 20:46:43.834277 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerDied","Data":"a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040"} Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.593285 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-vspq5" podUID="c72ac557-7882-4120-b64a-4343639cc766" containerName="ovn-controller" probeResult="failure" output=< Mar 13 20:46:44 crc kubenswrapper[4790]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 13 20:46:44 crc kubenswrapper[4790]: > Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.685134 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.686372 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-k7bzr" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.908202 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.909759 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.911841 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 13 20:46:44 crc kubenswrapper[4790]: I0313 20:46:44.936043 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066698 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066779 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066841 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " 
pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066947 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.066983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.168580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.168868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.168932 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169255 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169356 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: 
\"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169367 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.169634 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.171636 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.187621 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"ovn-controller-vspq5-config-bhht7\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:45 crc kubenswrapper[4790]: I0313 20:46:45.231657 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.168120 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.322841 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") pod \"cfa975ed-d42b-43be-91a1-4a2288005883\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.323030 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") pod \"cfa975ed-d42b-43be-91a1-4a2288005883\" (UID: \"cfa975ed-d42b-43be-91a1-4a2288005883\") " Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.323832 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfa975ed-d42b-43be-91a1-4a2288005883" (UID: "cfa975ed-d42b-43be-91a1-4a2288005883"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.326960 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6" (OuterVolumeSpecName: "kube-api-access-jhbt6") pod "cfa975ed-d42b-43be-91a1-4a2288005883" (UID: "cfa975ed-d42b-43be-91a1-4a2288005883"). InnerVolumeSpecName "kube-api-access-jhbt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.425113 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfa975ed-d42b-43be-91a1-4a2288005883-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.425150 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbt6\" (UniqueName: \"kubernetes.io/projected/cfa975ed-d42b-43be-91a1-4a2288005883-kube-api-access-jhbt6\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.897487 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fjjbp" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.897486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fjjbp" event={"ID":"cfa975ed-d42b-43be-91a1-4a2288005883","Type":"ContainerDied","Data":"3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.898269 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bf57dd5b57dc7d65b3f15779be0b8a3e858bcc1e8b44b3a939aa707c5b1c099" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.903966 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:48 crc kubenswrapper[4790]: W0313 20:46:48.914969 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40a4a31a_66d7_491c_bb0a_e2ec83fc928c.slice/crio-1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb WatchSource:0}: Error finding container 1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb: Status 404 returned error can't find the container with id 1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.917518 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"bdc973933e06ab9bcb50133f253fc70761af872f58b342af7472b9f35c74b873"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.917562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"46594b4458d0348ffb2e7b5e3a1c9038df2cfbe6502cd7cc99631f58654fcca1"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.921405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerStarted","Data":"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.921624 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/rabbitmq-server-0" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.926920 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerStarted","Data":"9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4"} Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.927142 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.950242 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=54.832492007 podStartE2EDuration="1m3.950223587s" podCreationTimestamp="2026-03-13 20:45:45 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.876068302 +0000 UTC m=+1089.897184183" lastFinishedPulling="2026-03-13 20:46:07.993799872 +0000 UTC m=+1099.014915763" observedRunningTime="2026-03-13 20:46:48.944013978 +0000 UTC m=+1139.965129879" watchObservedRunningTime="2026-03-13 20:46:48.950223587 +0000 UTC m=+1139.971339478" Mar 13 20:46:48 crc kubenswrapper[4790]: I0313 20:46:48.980818 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=54.447017916 podStartE2EDuration="1m4.980796479s" podCreationTimestamp="2026-03-13 20:45:44 +0000 UTC" firstStartedPulling="2026-03-13 20:45:58.906120459 +0000 UTC m=+1089.927236350" lastFinishedPulling="2026-03-13 20:46:09.439899022 +0000 UTC m=+1100.461014913" observedRunningTime="2026-03-13 20:46:48.971555827 +0000 UTC m=+1139.992671718" watchObservedRunningTime="2026-03-13 20:46:48.980796479 +0000 UTC m=+1140.001912370" Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.611539 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-vspq5" Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.936354 4790 generic.go:334] "Generic (PLEG): container finished" podID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerID="08d59d9ecbc8376b9de39bc3a93a8ca2a0b84d09598e5daa63ce7fe053fdaadf" exitCode=0 Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.936720 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5-config-bhht7" event={"ID":"40a4a31a-66d7-491c-bb0a-e2ec83fc928c","Type":"ContainerDied","Data":"08d59d9ecbc8376b9de39bc3a93a8ca2a0b84d09598e5daa63ce7fe053fdaadf"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.936751 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5-config-bhht7" event={"ID":"40a4a31a-66d7-491c-bb0a-e2ec83fc928c","Type":"ContainerStarted","Data":"1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.938674 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerStarted","Data":"7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.941309 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"6365d7f281748fda6e37e1322ae1ca8277b8843c96ec9e7cceef6dfbf552801d"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.941368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"a0d6a9f8a9598434b5c6c515c3cac18ae26d99e999a72ce20259db06fffa438e"} Mar 13 20:46:49 crc kubenswrapper[4790]: I0313 20:46:49.973531 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-pshzp" podStartSLOduration=3.014532408 podStartE2EDuration="15.973513984s" podCreationTimestamp="2026-03-13 20:46:34 +0000 UTC" firstStartedPulling="2026-03-13 20:46:35.51366368 +0000 UTC m=+1126.534779571" lastFinishedPulling="2026-03-13 20:46:48.472645256 +0000 UTC m=+1139.493761147" observedRunningTime="2026-03-13 20:46:49.968334513 +0000 UTC m=+1140.989450404" watchObservedRunningTime="2026-03-13 20:46:49.973513984 +0000 UTC m=+1140.994629875" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.835804 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.881799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.881964 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882000 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882026 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882058 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.882115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") pod \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\" (UID: \"40a4a31a-66d7-491c-bb0a-e2ec83fc928c\") " Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883241 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883284 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run" (OuterVolumeSpecName: "var-run") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.883731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts" (OuterVolumeSpecName: "scripts") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.884347 4790 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.884468 4790 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.884541 4790 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.900165 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb" (OuterVolumeSpecName: "kube-api-access-5kzsb") pod "40a4a31a-66d7-491c-bb0a-e2ec83fc928c" (UID: "40a4a31a-66d7-491c-bb0a-e2ec83fc928c"). InnerVolumeSpecName "kube-api-access-5kzsb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.956200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-vspq5-config-bhht7" event={"ID":"40a4a31a-66d7-491c-bb0a-e2ec83fc928c","Type":"ContainerDied","Data":"1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb"} Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.956240 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1048379e8efcb844f06a2b362f4b322af4d288c96284f1a7dec3fca8b68074eb" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.956304 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-vspq5-config-bhht7" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.985950 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.985987 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kzsb\" (UniqueName: \"kubernetes.io/projected/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-kube-api-access-5kzsb\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:51 crc kubenswrapper[4790]: I0313 20:46:51.986001 4790 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40a4a31a-66d7-491c-bb0a-e2ec83fc928c-var-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.957274 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"017619036f9ca6f810c213f0b3de8f4ea5c090f4dc5c6ff264835b9dcad3c6bb"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968314 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"73fef00ea772029c19cf3c552184e3397b7e3431ac12bdeceb7ae46545b3ba70"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968325 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"7f67885f1d32551fd88f2f0c1a892a533f5cd15e4c6f0ce76ad73176eced75dc"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.968333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"20e79c8e864aea43f16daa3be101f4b3c472f442b10edd567b174e0ccc4d9ee4"} Mar 13 20:46:52 crc kubenswrapper[4790]: I0313 20:46:52.971783 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-vspq5-config-bhht7"] Mar 13 20:46:53 crc kubenswrapper[4790]: I0313 20:46:53.676968 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" path="/var/lib/kubelet/pods/40a4a31a-66d7-491c-bb0a-e2ec83fc928c/volumes" Mar 13 20:46:55 crc kubenswrapper[4790]: I0313 20:46:55.992350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"3f7833024195dd324060c218c2a3f5fb900ef53a89d5be73c310f68e13708b86"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.020871 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"9b444b8eca824e8f01844c07a099fab88a2db4ebaa62fe1ed779ac33cb699fe5"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"d11eadc1c5b3bb1ddb9981d59d820dadfe7f934bbdd45f8604fbabb2a3ea57bd"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021294 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"5d7b94ae5d4e238cb2860ed3a1e775bc3f7b10e31bbba0b31422281bc140b6d4"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"3d43b5ff0c1cf6471f540894b4d67fb592205c26dcf5cb8c5a0070409f1c3446"} Mar 13 20:46:57 crc kubenswrapper[4790]: I0313 20:46:57.021320 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"b2f6c7637a8cfb3deb39061dcc37aa3239822aceb779db6f1dfa5f460bdf0e54"} Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.033893 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"529b41ec-f1ee-432c-ac41-6957e1809aaa","Type":"ContainerStarted","Data":"bc32d534140b7118dc53dd1912c614b1d4f3abe74dd799cbe78cd522d23613ae"} Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.097844 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.445998070999998 podStartE2EDuration="37.097825402s" podCreationTimestamp="2026-03-13 20:46:21 +0000 UTC" firstStartedPulling="2026-03-13 20:46:38.967203199 +0000 UTC m=+1129.988319090" lastFinishedPulling="2026-03-13 20:46:55.61903053 +0000 UTC m=+1146.640146421" observedRunningTime="2026-03-13 20:46:58.096530356 +0000 UTC m=+1149.117646257" watchObservedRunningTime="2026-03-13 20:46:58.097825402 +0000 UTC m=+1149.118941293" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.380545 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:46:58 crc kubenswrapper[4790]: E0313 20:46:58.380964 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" containerName="mariadb-account-create-update" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.380985 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" containerName="mariadb-account-create-update" Mar 13 20:46:58 crc kubenswrapper[4790]: E0313 20:46:58.381006 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerName="ovn-config" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.381012 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" 
containerName="ovn-config" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.381173 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" containerName="mariadb-account-create-update" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.381198 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="40a4a31a-66d7-491c-bb0a-e2ec83fc928c" containerName="ovn-config" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.382074 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.384617 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.396642 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487890 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487910 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.487926 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.488161 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.488245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589488 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589572 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589596 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.589637 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.590716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.590724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.590928 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.591335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.591461 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.609495 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"dnsmasq-dns-764c5664d7-kgfxm\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:58 crc kubenswrapper[4790]: I0313 20:46:58.712438 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:46:59 crc kubenswrapper[4790]: I0313 20:46:59.042878 4790 generic.go:334] "Generic (PLEG): container finished" podID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerID="7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c" exitCode=0 Mar 13 20:46:59 crc kubenswrapper[4790]: I0313 20:46:59.042935 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerDied","Data":"7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c"} Mar 13 20:46:59 crc kubenswrapper[4790]: W0313 20:46:59.206268 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37649c3b_ff5a_4ec3_a118_6a35d72bb4a2.slice/crio-f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40 WatchSource:0}: Error finding container f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40: Status 404 returned error can't find the container with id f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40 Mar 13 20:46:59 crc kubenswrapper[4790]: I0313 20:46:59.217481 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.051429 4790 generic.go:334] "Generic (PLEG): container finished" podID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" exitCode=0 Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.051549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerDied","Data":"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300"} Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.051652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerStarted","Data":"f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40"} Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.464965 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521062 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.521278 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") pod \"a93720f0-c882-49d8-bd56-7d77237da6e7\" (UID: \"a93720f0-c882-49d8-bd56-7d77237da6e7\") " Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.525927 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.529137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9" (OuterVolumeSpecName: "kube-api-access-f7mv9") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "kube-api-access-f7mv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.544174 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.562007 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data" (OuterVolumeSpecName: "config-data") pod "a93720f0-c882-49d8-bd56-7d77237da6e7" (UID: "a93720f0-c882-49d8-bd56-7d77237da6e7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622765 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622797 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622812 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7mv9\" (UniqueName: \"kubernetes.io/projected/a93720f0-c882-49d8-bd56-7d77237da6e7-kube-api-access-f7mv9\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:00 crc kubenswrapper[4790]: I0313 20:47:00.622820 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a93720f0-c882-49d8-bd56-7d77237da6e7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.061504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-pshzp" event={"ID":"a93720f0-c882-49d8-bd56-7d77237da6e7","Type":"ContainerDied","Data":"e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8"} Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.061613 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e64d851bd686920edd764cd24360ad64b0e4d7ace08c6bc76c9a4b613130fbe8" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.061648 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-pshzp" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.066493 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerStarted","Data":"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8"} Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.066670 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.105534 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" podStartSLOduration=3.10551271 podStartE2EDuration="3.10551271s" podCreationTimestamp="2026-03-13 20:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:01.091336034 +0000 UTC m=+1152.112451925" watchObservedRunningTime="2026-03-13 20:47:01.10551271 +0000 UTC m=+1152.126628601" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.420589 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.451716 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:01 crc kubenswrapper[4790]: E0313 20:47:01.454898 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerName="glance-db-sync" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.454938 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerName="glance-db-sync" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.455152 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" containerName="glance-db-sync" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.456603 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.471044 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538405 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538539 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538839 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538940 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.538972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640344 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640369 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640421 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.640444 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641264 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641731 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.641809 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.658255 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcd9\" (UniqueName: 
\"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"dnsmasq-dns-74f6bcbc87-v8dxb\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:01 crc kubenswrapper[4790]: I0313 20:47:01.774592 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.082302 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" containerID="cri-o://650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" gracePeriod=10 Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.175645 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:03 crc kubenswrapper[4790]: W0313 20:47:03.192481 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34d41874_8dfa_4e3d_9298_d027a3e3c921.slice/crio-deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8 WatchSource:0}: Error finding container deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8: Status 404 returned error can't find the container with id deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8 Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.624886 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.679947 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680129 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680178 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.680320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") pod \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\" (UID: \"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2\") " Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.687140 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b" (OuterVolumeSpecName: "kube-api-access-mnz2b") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "kube-api-access-mnz2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.727194 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config" (OuterVolumeSpecName: "config") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.729907 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.734687 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.735954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.737325 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" (UID: "37649c3b-ff5a-4ec3-a118-6a35d72bb4a2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781753 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781786 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781797 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnz2b\" (UniqueName: \"kubernetes.io/projected/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-kube-api-access-mnz2b\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781806 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781814 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:03 crc kubenswrapper[4790]: I0313 20:47:03.781822 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.090039 4790 generic.go:334] "Generic (PLEG): container finished" podID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerID="03a87f5d6c3388f53ac8b07b4a8345caa059485eb6f71dad3953ac168c0ce643" exitCode=0 Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.090131 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerDied","Data":"03a87f5d6c3388f53ac8b07b4a8345caa059485eb6f71dad3953ac168c0ce643"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.092183 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerStarted","Data":"deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099663 4790 generic.go:334] "Generic (PLEG): container finished" podID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" exitCode=0 Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099712 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerDied","Data":"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099747 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" event={"ID":"37649c3b-ff5a-4ec3-a118-6a35d72bb4a2","Type":"ContainerDied","Data":"f2ee2aebbd470d658512f56eb6204c2f2beaa533895f3a788189a41f33cd8f40"} Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.099766 4790 scope.go:117] "RemoveContainer" 
containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.108856 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-764c5664d7-kgfxm" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.151837 4790 scope.go:117] "RemoveContainer" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.165410 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.171828 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-764c5664d7-kgfxm"] Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.183183 4790 scope.go:117] "RemoveContainer" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" Mar 13 20:47:04 crc kubenswrapper[4790]: E0313 20:47:04.183616 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8\": container with ID starting with 650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8 not found: ID does not exist" containerID="650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.183668 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8"} err="failed to get container status \"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8\": rpc error: code = NotFound desc = could not find container \"650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8\": container with ID starting with 650b5808006b140c4875cb2a392ded630e2236906ea44d9599dca484a8a47bf8 not found: ID does not exist" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.183700 4790 scope.go:117] "RemoveContainer" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" Mar 13 20:47:04 crc kubenswrapper[4790]: E0313 20:47:04.188891 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300\": container with ID starting with 5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300 not found: ID does not exist" containerID="5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300" Mar 13 20:47:04 crc kubenswrapper[4790]: I0313 20:47:04.188941 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300"} err="failed to get container status \"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300\": rpc error: code = NotFound desc = could not find container \"5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300\": container with ID starting with 5162dfca40a2769b55fd2264df73ec279e1c28b641cba843385959879d6e1300 not found: ID does not exist" Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.109008 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" 
event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerStarted","Data":"267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942"} Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.110842 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.137083 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podStartSLOduration=4.137063481 podStartE2EDuration="4.137063481s" podCreationTimestamp="2026-03-13 20:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:05.130129352 +0000 UTC m=+1156.151245243" watchObservedRunningTime="2026-03-13 20:47:05.137063481 +0000 UTC m=+1156.158179372" Mar 13 20:47:05 crc kubenswrapper[4790]: I0313 20:47:05.669574 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" path="/var/lib/kubelet/pods/37649c3b-ff5a-4ec3-a118-6a35d72bb4a2/volumes" Mar 13 20:47:06 crc kubenswrapper[4790]: I0313 20:47:06.410661 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:47:06 crc kubenswrapper[4790]: I0313 20:47:06.521556 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.133184 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:47:08 crc kubenswrapper[4790]: E0313 20:47:08.134168 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="init" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.134229 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="init" Mar 13 20:47:08 crc kubenswrapper[4790]: E0313 20:47:08.134333 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.134402 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.134639 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37649c3b-ff5a-4ec3-a118-6a35d72bb4a2" containerName="dnsmasq-dns" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.135672 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.143879 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.164118 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.164233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.241585 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.242658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.246079 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.253208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.265629 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.265707 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.266444 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.287031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"cinder-db-create-56s96\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.336255 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.337559 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.348773 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367238 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367275 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.367306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.444041 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.445448 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.448342 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.453277 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.457671 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.472947 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.472991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.473018 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.473096 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.473766 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.474597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.501468 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"barbican-db-create-f926w\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.501930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"cinder-eae0-account-create-update-ljhjl\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.548077 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.549983 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.559839 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.570133 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.574233 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.574285 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.648203 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.649219 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.653554 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.657676 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.658811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.659028 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.659162 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.659937 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.675311 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.676578 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683371 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.683566 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.685185 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.687836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.690504 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.712258 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"barbican-3bc0-account-create-update-ntn27\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.758868 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784497 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784562 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784629 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.784732 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.785944 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.811980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45trb\" (UniqueName: 
\"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"neutron-db-create-4p54c\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.880090 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.885860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.885966 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.885991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.886032 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.886063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.887126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.890038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.891263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc 
kubenswrapper[4790]: I0313 20:47:08.903851 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"neutron-4d80-account-create-update-7trkt\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.904819 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"keystone-db-sync-jf9fb\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:08 crc kubenswrapper[4790]: I0313 20:47:08.970053 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.002454 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.008750 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.152058 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56s96" event={"ID":"8e11abfd-7d59-479b-9f77-cbbd22cbf48c","Type":"ContainerStarted","Data":"5b4afab25af66e7d81a8a8f191da40314dafdae52dadeffca7f74035d3ce1c8a"} Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.210753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:47:09 crc kubenswrapper[4790]: W0313 20:47:09.225929 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1dd76b06_ea34_4044_bba0_cf5e6e822b6b.slice/crio-69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480 WatchSource:0}: Error finding container 69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480: Status 404 returned error can't find the container with id 69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480 Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.303413 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.449956 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:47:09 crc kubenswrapper[4790]: W0313 20:47:09.460839 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcfc09f48_1b0c_45fe_be9b_8bf3a3af887c.slice/crio-2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9 WatchSource:0}: Error finding container 2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9: Status 404 returned error can't find the container with id 2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9 Mar 13 20:47:09 crc kubenswrapper[4790]: W0313 20:47:09.464613 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode551be1a_728e_4851_894c_30b4493326d6.slice/crio-fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2 WatchSource:0}: Error 
finding container fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2: Status 404 returned error can't find the container with id fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2 Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.506591 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.607842 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:47:09 crc kubenswrapper[4790]: I0313 20:47:09.615272 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.162557 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerStarted","Data":"02abbfffc75d71b3a16f41eb48aa257f23b07611d502300367b9d004460a9261"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.163798 4790 generic.go:334] "Generic (PLEG): container finished" podID="e551be1a-728e-4851-894c-30b4493326d6" containerID="047c96b0959e792e896cbcb062d30482e777ac7ce2334a4427efe91c5a39d9a3" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.163851 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4p54c" event={"ID":"e551be1a-728e-4851-894c-30b4493326d6","Type":"ContainerDied","Data":"047c96b0959e792e896cbcb062d30482e777ac7ce2334a4427efe91c5a39d9a3"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.163870 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4p54c" event={"ID":"e551be1a-728e-4851-894c-30b4493326d6","Type":"ContainerStarted","Data":"fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.168680 4790 generic.go:334] "Generic (PLEG): container finished" podID="fc51014c-323e-4a6b-9202-edc7b135809d" containerID="7f1ca4be311e4bf8899acd7ffc7b40f8dd562b652669b076fe646ca2df5ae15e" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.168772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d80-account-create-update-7trkt" event={"ID":"fc51014c-323e-4a6b-9202-edc7b135809d","Type":"ContainerDied","Data":"7f1ca4be311e4bf8899acd7ffc7b40f8dd562b652669b076fe646ca2df5ae15e"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.168797 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d80-account-create-update-7trkt" event={"ID":"fc51014c-323e-4a6b-9202-edc7b135809d","Type":"ContainerStarted","Data":"87a437edea37085830d79b3968bbcf7a8e5b2600d9fbf0f8b597555054038d5f"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.181730 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerID="36f3978e6e158babd7d1c6c18b804e801c1d5a860c6298e1e465b9030818d00c" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.181820 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56s96" event={"ID":"8e11abfd-7d59-479b-9f77-cbbd22cbf48c","Type":"ContainerDied","Data":"36f3978e6e158babd7d1c6c18b804e801c1d5a860c6298e1e465b9030818d00c"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.189136 4790 generic.go:334] "Generic (PLEG): container finished" podID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" 
containerID="37311d8f14a45460392cc2657752fc09be6fc325071ebe0626eb04d799e80545" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.189209 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f926w" event={"ID":"1dd76b06-ea34-4044-bba0-cf5e6e822b6b","Type":"ContainerDied","Data":"37311d8f14a45460392cc2657752fc09be6fc325071ebe0626eb04d799e80545"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.189239 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f926w" event={"ID":"1dd76b06-ea34-4044-bba0-cf5e6e822b6b","Type":"ContainerStarted","Data":"69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.219411 4790 generic.go:334] "Generic (PLEG): container finished" podID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerID="fb829732267d5d36436612626f2036bb0698b4bd86f5c88383f3ee7aba396142" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.219520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3bc0-account-create-update-ntn27" event={"ID":"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c","Type":"ContainerDied","Data":"fb829732267d5d36436612626f2036bb0698b4bd86f5c88383f3ee7aba396142"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.219563 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3bc0-account-create-update-ntn27" event={"ID":"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c","Type":"ContainerStarted","Data":"2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.260849 4790 generic.go:334] "Generic (PLEG): container finished" podID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerID="4f445f85254948b2a82910d93997f50d41021103d40e52ebd6447aec6a71de39" exitCode=0 Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.260913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eae0-account-create-update-ljhjl" event={"ID":"e7d496eb-3f17-4e7b-9a68-c91dec27355a","Type":"ContainerDied","Data":"4f445f85254948b2a82910d93997f50d41021103d40e52ebd6447aec6a71de39"} Mar 13 20:47:10 crc kubenswrapper[4790]: I0313 20:47:10.260937 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eae0-account-create-update-ljhjl" event={"ID":"e7d496eb-3f17-4e7b-9a68-c91dec27355a","Type":"ContainerStarted","Data":"8ef7747d9ef2dbad4cef572c11db32bad3072a3edb1b59e19bb36fd7f24b5297"} Mar 13 20:47:11 crc kubenswrapper[4790]: I0313 20:47:11.776211 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:11 crc kubenswrapper[4790]: I0313 20:47:11.901313 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:47:11 crc kubenswrapper[4790]: I0313 20:47:11.901605 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-gv56q" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" containerID="cri-o://e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb" gracePeriod=10 Mar 13 20:47:12 crc kubenswrapper[4790]: I0313 20:47:12.282137 4790 generic.go:334] "Generic (PLEG): container finished" podID="d798b6d8-8c2b-4827-81d3-09177054591f" containerID="e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb" exitCode=0 Mar 13 20:47:12 crc kubenswrapper[4790]: I0313 20:47:12.282216 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerDied","Data":"e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb"} Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.015335 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.015425 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.957024 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:14 crc kubenswrapper[4790]: I0313 20:47:14.965204 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") pod \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012834 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") pod \"e551be1a-728e-4851-894c-30b4493326d6\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") pod \"e551be1a-728e-4851-894c-30b4493326d6\" (UID: \"e551be1a-728e-4851-894c-30b4493326d6\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.012998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") pod \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\" (UID: \"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.013108 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" (UID: "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.013240 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e551be1a-728e-4851-894c-30b4493326d6" (UID: "e551be1a-728e-4851-894c-30b4493326d6"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.013995 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e551be1a-728e-4851-894c-30b4493326d6-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.014012 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.017707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h" (OuterVolumeSpecName: "kube-api-access-r4q5h") pod "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" (UID: "cfc09f48-1b0c-45fe-be9b-8bf3a3af887c"). InnerVolumeSpecName "kube-api-access-r4q5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.017818 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb" (OuterVolumeSpecName: "kube-api-access-45trb") pod "e551be1a-728e-4851-894c-30b4493326d6" (UID: "e551be1a-728e-4851-894c-30b4493326d6"). InnerVolumeSpecName "kube-api-access-45trb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.026254 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.036597 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.048888 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.064759 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.076313 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") pod \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114599 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") pod \"fc51014c-323e-4a6b-9202-edc7b135809d\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") pod \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114668 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") pod \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\" (UID: \"8e11abfd-7d59-479b-9f77-cbbd22cbf48c\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114698 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") pod \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\" (UID: \"1dd76b06-ea34-4044-bba0-cf5e6e822b6b\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114788 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") pod \"fc51014c-323e-4a6b-9202-edc7b135809d\" (UID: \"fc51014c-323e-4a6b-9202-edc7b135809d\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114837 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") pod \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.114874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") pod \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\" (UID: \"e7d496eb-3f17-4e7b-9a68-c91dec27355a\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.115221 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45trb\" (UniqueName: \"kubernetes.io/projected/e551be1a-728e-4851-894c-30b4493326d6-kube-api-access-45trb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.115246 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4q5h\" (UniqueName: \"kubernetes.io/projected/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c-kube-api-access-r4q5h\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 
crc kubenswrapper[4790]: I0313 20:47:15.119086 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm" (OuterVolumeSpecName: "kube-api-access-b5hkm") pod "e7d496eb-3f17-4e7b-9a68-c91dec27355a" (UID: "e7d496eb-3f17-4e7b-9a68-c91dec27355a"). InnerVolumeSpecName "kube-api-access-b5hkm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.127195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n" (OuterVolumeSpecName: "kube-api-access-6vj2n") pod "1dd76b06-ea34-4044-bba0-cf5e6e822b6b" (UID: "1dd76b06-ea34-4044-bba0-cf5e6e822b6b"). InnerVolumeSpecName "kube-api-access-6vj2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.127861 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8e11abfd-7d59-479b-9f77-cbbd22cbf48c" (UID: "8e11abfd-7d59-479b-9f77-cbbd22cbf48c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.128355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1dd76b06-ea34-4044-bba0-cf5e6e822b6b" (UID: "1dd76b06-ea34-4044-bba0-cf5e6e822b6b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.128787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fc51014c-323e-4a6b-9202-edc7b135809d" (UID: "fc51014c-323e-4a6b-9202-edc7b135809d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.129578 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e7d496eb-3f17-4e7b-9a68-c91dec27355a" (UID: "e7d496eb-3f17-4e7b-9a68-c91dec27355a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.131395 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj" (OuterVolumeSpecName: "kube-api-access-sd5fj") pod "8e11abfd-7d59-479b-9f77-cbbd22cbf48c" (UID: "8e11abfd-7d59-479b-9f77-cbbd22cbf48c"). InnerVolumeSpecName "kube-api-access-sd5fj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.134949 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7" (OuterVolumeSpecName: "kube-api-access-t2wj7") pod "fc51014c-323e-4a6b-9202-edc7b135809d" (UID: "fc51014c-323e-4a6b-9202-edc7b135809d"). 
InnerVolumeSpecName "kube-api-access-t2wj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.215923 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216019 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216047 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216147 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") pod \"d798b6d8-8c2b-4827-81d3-09177054591f\" (UID: \"d798b6d8-8c2b-4827-81d3-09177054591f\") " Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216564 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vj2n\" (UniqueName: \"kubernetes.io/projected/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-kube-api-access-6vj2n\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216586 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2wj7\" (UniqueName: \"kubernetes.io/projected/fc51014c-323e-4a6b-9202-edc7b135809d-kube-api-access-t2wj7\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216595 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd5fj\" (UniqueName: \"kubernetes.io/projected/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-kube-api-access-sd5fj\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216779 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8e11abfd-7d59-479b-9f77-cbbd22cbf48c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216790 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1dd76b06-ea34-4044-bba0-cf5e6e822b6b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216801 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fc51014c-323e-4a6b-9202-edc7b135809d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc 
kubenswrapper[4790]: I0313 20:47:15.216820 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e7d496eb-3f17-4e7b-9a68-c91dec27355a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.216833 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5hkm\" (UniqueName: \"kubernetes.io/projected/e7d496eb-3f17-4e7b-9a68-c91dec27355a-kube-api-access-b5hkm\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.219876 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q" (OuterVolumeSpecName: "kube-api-access-bmp6q") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "kube-api-access-bmp6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.256489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.258476 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.260630 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config" (OuterVolumeSpecName: "config") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.261589 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d798b6d8-8c2b-4827-81d3-09177054591f" (UID: "d798b6d8-8c2b-4827-81d3-09177054591f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.305131 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-56s96" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.305293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-56s96" event={"ID":"8e11abfd-7d59-479b-9f77-cbbd22cbf48c","Type":"ContainerDied","Data":"5b4afab25af66e7d81a8a8f191da40314dafdae52dadeffca7f74035d3ce1c8a"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.305347 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b4afab25af66e7d81a8a8f191da40314dafdae52dadeffca7f74035d3ce1c8a" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.306859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-gv56q" event={"ID":"d798b6d8-8c2b-4827-81d3-09177054591f","Type":"ContainerDied","Data":"c3c4a9ad42afcf8df041c9c0555750547f59bfe84f27594765b42944074e50c3"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.306896 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-gv56q" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.306933 4790 scope.go:117] "RemoveContainer" containerID="e8757b5e6f39b607b4f89f7c3ecb73428b1e5ac3dca1607fb2f473649fb57fcb" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.309809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-f926w" event={"ID":"1dd76b06-ea34-4044-bba0-cf5e6e822b6b","Type":"ContainerDied","Data":"69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.309851 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-f926w" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.309853 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69eb03448cc715ed946241b6a35f2a90772b59bd462ea39afd9ab3efb2303480" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.311929 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3bc0-account-create-update-ntn27" event={"ID":"cfc09f48-1b0c-45fe-be9b-8bf3a3af887c","Type":"ContainerDied","Data":"2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.311963 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3bc0-account-create-update-ntn27" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.311972 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2f031028c701aab147e28455b17e92b1df62872441dc58d22a17cfab01dd04b9" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.317491 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-eae0-account-create-update-ljhjl" event={"ID":"e7d496eb-3f17-4e7b-9a68-c91dec27355a","Type":"ContainerDied","Data":"8ef7747d9ef2dbad4cef572c11db32bad3072a3edb1b59e19bb36fd7f24b5297"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.317562 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-eae0-account-create-update-ljhjl" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.317570 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8ef7747d9ef2dbad4cef572c11db32bad3072a3edb1b59e19bb36fd7f24b5297" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318466 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318480 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318489 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318497 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmp6q\" (UniqueName: \"kubernetes.io/projected/d798b6d8-8c2b-4827-81d3-09177054591f-kube-api-access-bmp6q\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.318506 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d798b6d8-8c2b-4827-81d3-09177054591f-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.325690 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerStarted","Data":"5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.330470 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4p54c" event={"ID":"e551be1a-728e-4851-894c-30b4493326d6","Type":"ContainerDied","Data":"fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.330526 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcb11536e1613f8250dbcb4276acec420f284088247038bb304099f68d9aabb2" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.330485 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4p54c" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.337031 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-4d80-account-create-update-7trkt" event={"ID":"fc51014c-323e-4a6b-9202-edc7b135809d","Type":"ContainerDied","Data":"87a437edea37085830d79b3968bbcf7a8e5b2600d9fbf0f8b597555054038d5f"} Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.337066 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87a437edea37085830d79b3968bbcf7a8e5b2600d9fbf0f8b597555054038d5f" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.337149 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-4d80-account-create-update-7trkt" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.343108 4790 scope.go:117] "RemoveContainer" containerID="978e68813566a9c04dd155a064a373e2649857a2eefbe05ca9b8949d3e9db280" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.349159 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.358765 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-gv56q"] Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.365823 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-jf9fb" podStartSLOduration=2.219555691 podStartE2EDuration="7.365808466s" podCreationTimestamp="2026-03-13 20:47:08 +0000 UTC" firstStartedPulling="2026-03-13 20:47:09.620153525 +0000 UTC m=+1160.641269416" lastFinishedPulling="2026-03-13 20:47:14.76640629 +0000 UTC m=+1165.787522191" observedRunningTime="2026-03-13 20:47:15.35712908 +0000 UTC m=+1166.378244971" watchObservedRunningTime="2026-03-13 20:47:15.365808466 +0000 UTC m=+1166.386924357" Mar 13 20:47:15 crc kubenswrapper[4790]: I0313 20:47:15.675904 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" path="/var/lib/kubelet/pods/d798b6d8-8c2b-4827-81d3-09177054591f/volumes" Mar 13 20:47:18 crc kubenswrapper[4790]: I0313 20:47:18.362708 4790 generic.go:334] "Generic (PLEG): container finished" podID="4214f238-4044-45ab-8e40-48894500f25f" containerID="5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134" exitCode=0 Mar 13 20:47:18 crc kubenswrapper[4790]: I0313 20:47:18.362827 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerDied","Data":"5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134"} Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.748597 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.893319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") pod \"4214f238-4044-45ab-8e40-48894500f25f\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.893364 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") pod \"4214f238-4044-45ab-8e40-48894500f25f\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.893412 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") pod \"4214f238-4044-45ab-8e40-48894500f25f\" (UID: \"4214f238-4044-45ab-8e40-48894500f25f\") " Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.899142 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn" (OuterVolumeSpecName: "kube-api-access-pbwsn") pod "4214f238-4044-45ab-8e40-48894500f25f" (UID: "4214f238-4044-45ab-8e40-48894500f25f"). InnerVolumeSpecName "kube-api-access-pbwsn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.919022 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4214f238-4044-45ab-8e40-48894500f25f" (UID: "4214f238-4044-45ab-8e40-48894500f25f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.937257 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data" (OuterVolumeSpecName: "config-data") pod "4214f238-4044-45ab-8e40-48894500f25f" (UID: "4214f238-4044-45ab-8e40-48894500f25f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.995832 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.995873 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbwsn\" (UniqueName: \"kubernetes.io/projected/4214f238-4044-45ab-8e40-48894500f25f-kube-api-access-pbwsn\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:19 crc kubenswrapper[4790]: I0313 20:47:19.995889 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4214f238-4044-45ab-8e40-48894500f25f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.378388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-jf9fb" event={"ID":"4214f238-4044-45ab-8e40-48894500f25f","Type":"ContainerDied","Data":"02abbfffc75d71b3a16f41eb48aa257f23b07611d502300367b9d004460a9261"} Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.378426 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="02abbfffc75d71b3a16f41eb48aa257f23b07611d502300367b9d004460a9261" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.378474 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-jf9fb" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615068 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615473 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="init" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615494 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="init" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615507 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615515 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615530 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615538 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615559 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4214f238-4044-45ab-8e40-48894500f25f" containerName="keystone-db-sync" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615568 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4214f238-4044-45ab-8e40-48894500f25f" containerName="keystone-db-sync" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615582 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" 
containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615589 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615604 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615612 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615625 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e551be1a-728e-4851-894c-30b4493326d6" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615632 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e551be1a-728e-4851-894c-30b4493326d6" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615649 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615656 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" Mar 13 20:47:20 crc kubenswrapper[4790]: E0313 20:47:20.615671 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615680 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615860 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615879 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615889 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615902 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4214f238-4044-45ab-8e40-48894500f25f" containerName="keystone-db-sync" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615930 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615944 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" containerName="mariadb-account-create-update" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615957 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d798b6d8-8c2b-4827-81d3-09177054591f" containerName="dnsmasq-dns" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.615967 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e551be1a-728e-4851-894c-30b4493326d6" 
containerName="mariadb-database-create" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.616971 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.629597 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.686139 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.687083 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.690767 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691192 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691406 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691563 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.691703 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707463 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707500 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707603 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " 
pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.707627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.719331 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809107 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809152 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809203 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809235 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.809677 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.810152 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.810815 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.810882 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.811041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.811182 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.811217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812262 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.812861 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.854937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"dnsmasq-dns-847c4cc679-87xrs\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.885547 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.886573 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.891584 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.892020 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6dm5h" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.892218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.907786 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.909455 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.914929 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915039 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915115 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915167 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod 
\"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.915228 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.917928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-697p5" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.918177 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.918312 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.918463 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.943897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.944000 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.945273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.946619 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.953366 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.953834 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.959905 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.978145 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.979247 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.984754 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9qb6s" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.985047 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 20:47:20 crc kubenswrapper[4790]: I0313 20:47:20.985210 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:20.999007 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"keystone-bootstrap-rfr4j\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.018068 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.018986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019024 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019044 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019079 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.019142 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.019169 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.048453 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.069442 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.120889 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.120948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.120983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121013 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121080 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121202 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121241 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121265 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121304 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.121332 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.164652 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.164783 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"horizon-58656c768f-spczn\" (UID: 
\"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.165465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.165916 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.182167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.189277 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.206479 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.225960 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226367 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226656 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226804 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.226904 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 
20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.227058 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.232473 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.234582 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.235906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.243442 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.258357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.259811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.260163 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.295802 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"neutron-db-sync-mg4xg\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.296202 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"horizon-58656c768f-spczn\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.309989 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.310607 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"cinder-db-sync-g2nmn\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.326321 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328413 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328615 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.328916 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.329023 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.329126 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.329260 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.424013 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453696 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453927 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.453992 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.456584 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.459552 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.463577 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.476062 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.484038 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2zqc7" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486285 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486580 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.486598 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.500940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.500997 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.504357 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.513910 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.518919 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.519146 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-dwzcz" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.519418 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.519578 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.520338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"ceilometer-0\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.538316 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.540043 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.547858 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.548099 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.548219 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2nkvl" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559714 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559748 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " 
pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559880 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559894 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.559925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.567243 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.597434 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.655946 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.656322 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663229 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663253 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663291 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663313 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663333 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663357 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663388 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663409 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.663451 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.663467 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.671824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.674028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.675547 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.680623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.681956 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.682609 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.685083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.689557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.691979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"barbican-db-sync-kkmzk\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.696647 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.698368 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.704447 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.737822 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.757161 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.758813 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765336 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765371 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.765510 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.795567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.827218 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.830010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.837192 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.842784 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.847966 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.851400 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.859698 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.864863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867397 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867423 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867490 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867621 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867648 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.867673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867691 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.867724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.868164 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.871460 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.872112 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.876009 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.877244 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.888469 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.909741 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.909813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"placement-db-sync-wbb8v\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:21 crc 
kubenswrapper[4790]: I0313 20:47:21.927422 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.969972 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970150 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970192 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970210 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970232 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod 
\"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970315 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970369 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970577 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970933 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.970962 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.971128 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.974571 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.975240 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.975820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.976658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.977745 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:21 crc kubenswrapper[4790]: I0313 20:47:21.997405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"dnsmasq-dns-785d8bcb8c-n8ckq\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.078671 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.078780 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod 
\"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.078809 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.080303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.082998 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.083063 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.085864 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.086518 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.086638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.091693 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092246 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092329 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092405 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092442 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092465 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092480 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.092513 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.093304 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.098924 4790 operation_generator.go:580] 
"MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.100291 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.100716 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.111318 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.112028 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"horizon-76485b6c5-pjfp4\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.115940 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.122589 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.138046 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.155447 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.193512 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.206046 4790 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.208204 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.225608 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeef97bfb_4275_4a0a_bae4_5442cf7400dd.slice/crio-e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0 WatchSource:0}: Error finding container e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0: Status 404 returned error can't find the container with id e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0 Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.248655 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4f7f54d0_0f93_497b_b5cb_2a35d7dc68f6.slice/crio-5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a WatchSource:0}: Error finding container 5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a: Status 404 returned error can't find the container with id 5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.323969 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.339342 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32ffb609_7a3b_42b7_b513_7003deefe5dd.slice/crio-593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd WatchSource:0}: Error finding container 593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd: Status 404 returned error can't find the container with id 593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.368851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.399427 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.404690 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.427870 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1abdfade_817b_4659_b8be_48bb516fb866.slice/crio-3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6 WatchSource:0}: Error finding container 3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6: Status 404 returned error can't find the container with id 3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6 Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.434551 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerStarted","Data":"593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.435759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58656c768f-spczn" event={"ID":"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6","Type":"ContainerStarted","Data":"5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.436986 4790 generic.go:334] "Generic (PLEG): container finished" podID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerID="0747b757edc09b2a27e6d814254501a0191a898c87d268e337ba01775251ef0e" exitCode=0 Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.437035 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" event={"ID":"016f2a5b-9c42-4f7b-bf5f-42eb5010b321","Type":"ContainerDied","Data":"0747b757edc09b2a27e6d814254501a0191a898c87d268e337ba01775251ef0e"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.437051 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" event={"ID":"016f2a5b-9c42-4f7b-bf5f-42eb5010b321","Type":"ContainerStarted","Data":"44f1620af41208c3efd1e9ed5400e4f2a150da135c8bb09cbfabc990bfdce7d6"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.439347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerStarted","Data":"be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.439600 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerStarted","Data":"f6dce01b5701bc8518d8b3503d0cb256f3f959202a6e0fa8b6a41a1cef8da1af"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.444892 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerStarted","Data":"e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0"} Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.501812 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rfr4j" podStartSLOduration=2.501787926 podStartE2EDuration="2.501787926s" podCreationTimestamp="2026-03-13 20:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:22.478914622 +0000 UTC m=+1173.500030513" watchObservedRunningTime="2026-03-13 20:47:22.501787926 +0000 UTC m=+1173.522903817" Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.515336 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.655687 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:47:22 crc kubenswrapper[4790]: W0313 20:47:22.657168 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dc42e6e_503c_4931_87e1_adcbf3469570.slice/crio-6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d WatchSource:0}: Error finding container 6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d: Status 404 returned error can't find the container with id 6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.664716 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.841091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:47:22 crc kubenswrapper[4790]: I0313 20:47:22.996200 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.072351 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:23 crc kubenswrapper[4790]: W0313 20:47:23.074863 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podccb1b2e8_4b05_411b_a540_6507fdd5775f.slice/crio-0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898 WatchSource:0}: Error finding container 0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898: Status 404 returned error can't find the container with id 0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898 Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123640 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.123909 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") pod \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\" (UID: \"016f2a5b-9c42-4f7b-bf5f-42eb5010b321\") " Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.153170 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k" (OuterVolumeSpecName: "kube-api-access-qth2k") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "kube-api-access-qth2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.154038 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.154073 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.168074 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.176963 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.190728 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.208924 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config" (OuterVolumeSpecName: "config") pod "016f2a5b-9c42-4f7b-bf5f-42eb5010b321" (UID: "016f2a5b-9c42-4f7b-bf5f-42eb5010b321"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qth2k\" (UniqueName: \"kubernetes.io/projected/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-kube-api-access-qth2k\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225811 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225850 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225863 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225874 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.225885 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/016f2a5b-9c42-4f7b-bf5f-42eb5010b321-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.465352 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.499212 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerStarted","Data":"ae6303d4ad793ad0e64f0d47ea54d176c052de04140f8c58d197bb273bafc45e"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.544638 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.544915 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerID="365da7d7e0570f42f94165ed1add103834755db474d98891c1b86296bdc4478f" exitCode=0 Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.545003 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerDied","Data":"365da7d7e0570f42f94165ed1add103834755db474d98891c1b86296bdc4478f"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.545028 4790 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerStarted","Data":"6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.559482 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.564169 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:23 crc kubenswrapper[4790]: E0313 20:47:23.567313 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerName="init" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.567335 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerName="init" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.567551 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" containerName="init" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.587850 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.595547 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.627138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" event={"ID":"016f2a5b-9c42-4f7b-bf5f-42eb5010b321","Type":"ContainerDied","Data":"44f1620af41208c3efd1e9ed5400e4f2a150da135c8bb09cbfabc990bfdce7d6"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.627239 4790 scope.go:117] "RemoveContainer" containerID="0747b757edc09b2a27e6d814254501a0191a898c87d268e337ba01775251ef0e" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.627446 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-847c4cc679-87xrs" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.655095 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerStarted","Data":"66bbb0c4358595b69723c41e22c295dc43c704b8a97a66ffe918a90a7b96cb73"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.661572 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.715288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.723729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerStarted","Data":"0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.733695 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerStarted","Data":"0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755306 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755347 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755430 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.755457 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.800869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerStarted","Data":"fea6afc911ca7e2dd3477729e74613955f874b4583f2a3acc5c3410afdd753e9"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.820299 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.834777 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerStarted","Data":"a21e9d9d91f185d12ea208152b90be684a6632b64a67e174026d61018c3b2d9d"} Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.856976 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-847c4cc679-87xrs"] Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857205 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857291 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857322 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.857923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.858764 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.860089 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.865306 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-mg4xg" podStartSLOduration=3.865289165 podStartE2EDuration="3.865289165s" podCreationTimestamp="2026-03-13 20:47:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:23.772882454 +0000 UTC m=+1174.793998345" watchObservedRunningTime="2026-03-13 20:47:23.865289165 +0000 UTC m=+1174.886405046" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.884144 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.894784 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"horizon-54dbf7ffd5-z6rf5\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:23 crc kubenswrapper[4790]: I0313 20:47:23.947775 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.574838 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.854602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerStarted","Data":"f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.870323 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerStarted","Data":"a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.871043 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.873923 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerStarted","Data":"3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.876313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54dbf7ffd5-z6rf5" event={"ID":"bdc44913-44bd-4899-8f7b-d4908bad33c3","Type":"ContainerStarted","Data":"bb504ca44dce1509b49f6335cf8ea3f3e23e6aefb5e7712baa5b25d4cf19fcdc"} Mar 13 20:47:24 crc kubenswrapper[4790]: I0313 20:47:24.904187 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" 
podStartSLOduration=3.9041639999999997 podStartE2EDuration="3.904164s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:24.897819347 +0000 UTC m=+1175.918935268" watchObservedRunningTime="2026-03-13 20:47:24.904164 +0000 UTC m=+1175.925279901" Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.674278 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="016f2a5b-9c42-4f7b-bf5f-42eb5010b321" path="/var/lib/kubelet/pods/016f2a5b-9c42-4f7b-bf5f-42eb5010b321/volumes" Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.896891 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerStarted","Data":"15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24"} Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.897088 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" containerID="cri-o://3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.897651 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" containerID="cri-o://15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.901967 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" containerID="cri-o://f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.902052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerStarted","Data":"d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2"} Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.902299 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" containerID="cri-o://d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2" gracePeriod=30 Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.941581 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.9415629039999995 podStartE2EDuration="4.941562904s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:25.922236847 +0000 UTC m=+1176.943352738" watchObservedRunningTime="2026-03-13 20:47:25.941562904 +0000 UTC m=+1176.962678795" Mar 13 20:47:25 crc kubenswrapper[4790]: I0313 20:47:25.943268 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.943258261 podStartE2EDuration="4.943258261s" 
podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:25.939646141 +0000 UTC m=+1176.960762062" watchObservedRunningTime="2026-03-13 20:47:25.943258261 +0000 UTC m=+1176.964374152" Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919292 4790 generic.go:334] "Generic (PLEG): container finished" podID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerID="d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2" exitCode=0 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919651 4790 generic.go:334] "Generic (PLEG): container finished" podID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerID="f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8" exitCode=143 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerDied","Data":"d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2"} Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.919719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerDied","Data":"f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8"} Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927352 4790 generic.go:334] "Generic (PLEG): container finished" podID="9acddcdc-720a-469a-8023-7762f1b7c025" containerID="15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24" exitCode=0 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927398 4790 generic.go:334] "Generic (PLEG): container finished" podID="9acddcdc-720a-469a-8023-7762f1b7c025" containerID="3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193" exitCode=143 Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerDied","Data":"15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24"} Mar 13 20:47:26 crc kubenswrapper[4790]: I0313 20:47:26.927445 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerDied","Data":"3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193"} Mar 13 20:47:28 crc kubenswrapper[4790]: I0313 20:47:28.953638 4790 generic.go:334] "Generic (PLEG): container finished" podID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerID="be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07" exitCode=0 Mar 13 20:47:28 crc kubenswrapper[4790]: I0313 20:47:28.954397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerDied","Data":"be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07"} Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.055782 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.098600 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.100143 4790 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.104005 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.122624 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133737 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133791 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133836 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.133885 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.136790 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.170189 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-686b857b8-6fghv"] Mar 13 20:47:30 
crc kubenswrapper[4790]: I0313 20:47:30.171587 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.194637 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686b857b8-6fghv"] Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235149 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f5105d-51ea-4e5e-832f-8302188a943a-logs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235250 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-config-data\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-scripts\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235583 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-tls-certs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235649 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-secret-key\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235680 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235715 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235729 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235789 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-combined-ca-bundle\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235822 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-755d7\" (UniqueName: \"kubernetes.io/projected/d0f5105d-51ea-4e5e-832f-8302188a943a-kube-api-access-755d7\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.235845 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.236748 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.236807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.237517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.246593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod 
\"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.254811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.263707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.264350 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"horizon-77655f674d-4r7h4\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337617 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-secret-key\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337699 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-combined-ca-bundle\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-755d7\" (UniqueName: \"kubernetes.io/projected/d0f5105d-51ea-4e5e-832f-8302188a943a-kube-api-access-755d7\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f5105d-51ea-4e5e-832f-8302188a943a-logs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-config-data\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.337812 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-scripts\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc 
kubenswrapper[4790]: I0313 20:47:30.337869 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-tls-certs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.338233 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d0f5105d-51ea-4e5e-832f-8302188a943a-logs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.339216 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-scripts\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.339401 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d0f5105d-51ea-4e5e-832f-8302188a943a-config-data\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.341183 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-combined-ca-bundle\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.344423 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-secret-key\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.345752 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0f5105d-51ea-4e5e-832f-8302188a943a-horizon-tls-certs\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.362564 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-755d7\" (UniqueName: \"kubernetes.io/projected/d0f5105d-51ea-4e5e-832f-8302188a943a-kube-api-access-755d7\") pod \"horizon-686b857b8-6fghv\" (UID: \"d0f5105d-51ea-4e5e-832f-8302188a943a\") " pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.427674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:47:30 crc kubenswrapper[4790]: I0313 20:47:30.494004 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.094035 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.168764 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.169016 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" containerID="cri-o://267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942" gracePeriod=10 Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.994342 4790 generic.go:334] "Generic (PLEG): container finished" podID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerID="267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942" exitCode=0 Mar 13 20:47:32 crc kubenswrapper[4790]: I0313 20:47:32.994400 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerDied","Data":"267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942"} Mar 13 20:47:36 crc kubenswrapper[4790]: I0313 20:47:36.776078 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.007601 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.028455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"315f7d58-8f13-4982-a1d6-25b3773f0b1a","Type":"ContainerDied","Data":"fea6afc911ca7e2dd3477729e74613955f874b4583f2a3acc5c3410afdd753e9"} Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.028510 4790 scope.go:117] "RemoveContainer" containerID="d5c5cf67155ec3e79570c279fba860c4f2a62f631dbcd5d56f4fd892ff4992f2" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.028535 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.193039 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194630 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194688 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194930 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.194990 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195068 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195153 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") pod \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\" (UID: \"315f7d58-8f13-4982-a1d6-25b3773f0b1a\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195642 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs" (OuterVolumeSpecName: "logs") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.195657 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.200557 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts" (OuterVolumeSpecName: "scripts") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.212591 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w" (OuterVolumeSpecName: "kube-api-access-6cj5w") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "kube-api-access-6cj5w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.214559 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.221970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.239912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.249087 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data" (OuterVolumeSpecName: "config-data") pod "315f7d58-8f13-4982-a1d6-25b3773f0b1a" (UID: "315f7d58-8f13-4982-a1d6-25b3773f0b1a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297576 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297621 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297637 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297652 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cj5w\" (UniqueName: \"kubernetes.io/projected/315f7d58-8f13-4982-a1d6-25b3773f0b1a-kube-api-access-6cj5w\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297664 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/315f7d58-8f13-4982-a1d6-25b3773f0b1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297702 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.297717 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/315f7d58-8f13-4982-a1d6-25b3773f0b1a-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.322275 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.365314 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.378285 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391150 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: E0313 20:47:37.391649 
4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391664 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" Mar 13 20:47:37 crc kubenswrapper[4790]: E0313 20:47:37.391716 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391726 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391956 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-httpd" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.391979 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" containerName="glance-log" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.393112 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.401916 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.413434 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.413691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.432962 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.445866 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.533943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534200 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.534337 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") pod \"9acddcdc-720a-469a-8023-7762f1b7c025\" (UID: \"9acddcdc-720a-469a-8023-7762f1b7c025\") " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.535084 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.535155 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs" (OuterVolumeSpecName: "logs") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537078 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537309 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537400 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537475 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537513 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537703 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.537717 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9acddcdc-720a-469a-8023-7762f1b7c025-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.540661 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts" (OuterVolumeSpecName: "scripts") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.543028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2" (OuterVolumeSpecName: "kube-api-access-rm7n2") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "kube-api-access-rm7n2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.564072 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.569458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.611915 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data" (OuterVolumeSpecName: "config-data") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.632265 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9acddcdc-720a-469a-8023-7762f1b7c025" (UID: "9acddcdc-720a-469a-8023-7762f1b7c025"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646418 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646526 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646575 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646601 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646657 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646686 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646730 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646740 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646751 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rm7n2\" (UniqueName: \"kubernetes.io/projected/9acddcdc-720a-469a-8023-7762f1b7c025-kube-api-access-rm7n2\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646772 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646780 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.646789 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9acddcdc-720a-469a-8023-7762f1b7c025-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.655200 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.655757 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.656070 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.661346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.668357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.674540 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.678360 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.678994 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.683435 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="315f7d58-8f13-4982-a1d6-25b3773f0b1a" path="/var/lib/kubelet/pods/315f7d58-8f13-4982-a1d6-25b3773f0b1a/volumes" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.684923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.694312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.739237 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:37 crc kubenswrapper[4790]: I0313 20:47:37.750368 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.037903 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"9acddcdc-720a-469a-8023-7762f1b7c025","Type":"ContainerDied","Data":"ae6303d4ad793ad0e64f0d47ea54d176c052de04140f8c58d197bb273bafc45e"} Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.039061 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.070011 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.088048 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.103475 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: E0313 20:47:38.103931 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.103944 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" Mar 13 20:47:38 crc kubenswrapper[4790]: E0313 20:47:38.103972 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.103978 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.104131 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-log" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.104144 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" containerName="glance-httpd" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.105149 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.110896 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.111180 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.118259 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164074 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164254 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164344 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164478 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.164518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266226 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266398 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266459 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266486 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266516 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266550 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266658 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.266970 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.267213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.267590 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273088 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273167 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.273727 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.282610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.291687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " pod="openstack/glance-default-external-api-0" Mar 13 20:47:38 crc kubenswrapper[4790]: I0313 20:47:38.431785 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:47:39 crc kubenswrapper[4790]: I0313 20:47:39.672636 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9acddcdc-720a-469a-8023-7762f1b7c025" path="/var/lib/kubelet/pods/9acddcdc-720a-469a-8023-7762f1b7c025/volumes" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.563807 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.563980 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n8h694h675h665h85h67ch86h97h5h66dhc6h6h9fhb6h8hc5h5cfh67fh577h6dhdch578h69h58fh594h5b7h56dhdbhc8h68bh645h55q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bdq79,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1abdfade-817b-4659-b8be-48bb516fb866): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.583641 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 20:47:40 crc kubenswrapper[4790]: 
E0313 20:47:40.583841 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n665h5ch5bh66dh66bh94h67h9fh5d6h5d4h56dh58ch7bh697h8h99h74hbch569h5dhbfh54bh64dh5c4h59h84h59dh589h5b5h58hf4h5b8q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v2wvs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-58656c768f-spczn_openstack(4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.585874 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-58656c768f-spczn" podUID="4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.597721 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.597909 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n595h4h568h5ch68fh5fbh588h645h8ch64h694h666h5fch586h65bh546h564h699h645h5fbh69h696h6chbch5b7h5f6hdch9ch5cfh8dh88h8dq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lq4pr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-54dbf7ffd5-z6rf5_openstack(bdc44913-44bd-4899-8f7b-d4908bad33c3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:40 crc kubenswrapper[4790]: E0313 20:47:40.600784 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-54dbf7ffd5-z6rf5" podUID="bdc44913-44bd-4899-8f7b-d4908bad33c3" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.629167 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708732 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708873 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.708997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.709114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.709244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") pod \"5c397c6e-8d19-4b92-bc31-61312531b3d9\" (UID: \"5c397c6e-8d19-4b92-bc31-61312531b3d9\") " Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.714113 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts" (OuterVolumeSpecName: "scripts") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.714352 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d" (OuterVolumeSpecName: "kube-api-access-kcf2d") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "kube-api-access-kcf2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.721011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.731001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.736000 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data" (OuterVolumeSpecName: "config-data") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.744805 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c397c6e-8d19-4b92-bc31-61312531b3d9" (UID: "5c397c6e-8d19-4b92-bc31-61312531b3d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811770 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811805 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811814 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811822 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811831 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kcf2d\" (UniqueName: \"kubernetes.io/projected/5c397c6e-8d19-4b92-bc31-61312531b3d9-kube-api-access-kcf2d\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:40 crc kubenswrapper[4790]: I0313 20:47:40.811839 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c397c6e-8d19-4b92-bc31-61312531b3d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.065486 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rfr4j" event={"ID":"5c397c6e-8d19-4b92-bc31-61312531b3d9","Type":"ContainerDied","Data":"f6dce01b5701bc8518d8b3503d0cb256f3f959202a6e0fa8b6a41a1cef8da1af"} Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.065854 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6dce01b5701bc8518d8b3503d0cb256f3f959202a6e0fa8b6a41a1cef8da1af" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.065660 4790 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rfr4j" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.707241 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.713950 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rfr4j"] Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.798359 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:47:41 crc kubenswrapper[4790]: E0313 20:47:41.798897 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerName="keystone-bootstrap" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.798923 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerName="keystone-bootstrap" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.799123 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" containerName="keystone-bootstrap" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.799863 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802524 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802670 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802828 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802859 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.802931 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.811851 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850898 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 
20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.850935 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.851003 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.851081 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.952692 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953054 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953135 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953289 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.953434 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.958920 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.959015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.959127 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.959137 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.967754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:41 crc kubenswrapper[4790]: I0313 20:47:41.969104 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"keystone-bootstrap-m4zxn\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:42 crc kubenswrapper[4790]: I0313 20:47:42.128843 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:47:43 crc kubenswrapper[4790]: I0313 20:47:43.674155 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c397c6e-8d19-4b92-bc31-61312531b3d9" path="/var/lib/kubelet/pods/5c397c6e-8d19-4b92-bc31-61312531b3d9/volumes" Mar 13 20:47:44 crc kubenswrapper[4790]: I0313 20:47:44.016050 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:47:44 crc kubenswrapper[4790]: I0313 20:47:44.016120 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:47:46 crc kubenswrapper[4790]: I0313 20:47:46.775978 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.134428 4790 generic.go:334] "Generic (PLEG): container finished" podID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerID="0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca" exitCode=0 Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.134487 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerDied","Data":"0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca"} Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.632441 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831645 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831719 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831747 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831813 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.831871 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") pod \"34d41874-8dfa-4e3d-9298-d027a3e3c921\" (UID: \"34d41874-8dfa-4e3d-9298-d027a3e3c921\") " Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.850418 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9" (OuterVolumeSpecName: "kube-api-access-rxcd9") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "kube-api-access-rxcd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.880458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.892733 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.894251 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config" (OuterVolumeSpecName: "config") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.904101 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.906684 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "34d41874-8dfa-4e3d-9298-d027a3e3c921" (UID: "34d41874-8dfa-4e3d-9298-d027a3e3c921"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934891 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934930 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934941 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934950 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934958 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/34d41874-8dfa-4e3d-9298-d027a3e3c921-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:47 crc kubenswrapper[4790]: I0313 20:47:47.934968 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rxcd9\" (UniqueName: \"kubernetes.io/projected/34d41874-8dfa-4e3d-9298-d027a3e3c921-kube-api-access-rxcd9\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.090465 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.090650 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tw2bf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-kkmzk_openstack(5dff6930-5d07-4df7-8d42-470ae83afd38): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.091859 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-kkmzk" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.118267 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.124989 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.152287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54dbf7ffd5-z6rf5" event={"ID":"bdc44913-44bd-4899-8f7b-d4908bad33c3","Type":"ContainerDied","Data":"bb504ca44dce1509b49f6335cf8ea3f3e23e6aefb5e7712baa5b25d4cf19fcdc"} Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.152316 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54dbf7ffd5-z6rf5" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.155782 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" event={"ID":"34d41874-8dfa-4e3d-9298-d027a3e3c921","Type":"ContainerDied","Data":"deda6dfa1a9df2280b428b849e29fe6809f9079a777def89e5ae47fabd177aa8"} Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.155864 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.161577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-58656c768f-spczn" event={"ID":"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6","Type":"ContainerDied","Data":"5c692e4cb4d1561525c952071763fb787c93bbc98ee4b7e875e7458714b9da0a"} Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.161813 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-58656c768f-spczn" Mar 13 20:47:48 crc kubenswrapper[4790]: E0313 20:47:48.164992 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-kkmzk" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.211957 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.222858 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6bcbc87-v8dxb"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239124 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239166 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239370 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239466 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239512 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239537 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") pod \"bdc44913-44bd-4899-8f7b-d4908bad33c3\" (UID: \"bdc44913-44bd-4899-8f7b-d4908bad33c3\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239573 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239683 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs" (OuterVolumeSpecName: "logs") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.239986 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") pod \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\" (UID: \"4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6\") " Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240206 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data" (OuterVolumeSpecName: "config-data") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240542 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240571 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data" (OuterVolumeSpecName: "config-data") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240842 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs" (OuterVolumeSpecName: "logs") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.240895 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts" (OuterVolumeSpecName: "scripts") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.241356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts" (OuterVolumeSpecName: "scripts") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244001 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244059 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs" (OuterVolumeSpecName: "kube-api-access-v2wvs") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "kube-api-access-v2wvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244093 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr" (OuterVolumeSpecName: "kube-api-access-lq4pr") pod "bdc44913-44bd-4899-8f7b-d4908bad33c3" (UID: "bdc44913-44bd-4899-8f7b-d4908bad33c3"). InnerVolumeSpecName "kube-api-access-lq4pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.244548 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" (UID: "4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6"). InnerVolumeSpecName "horizon-secret-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341590 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bdc44913-44bd-4899-8f7b-d4908bad33c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341617 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2wvs\" (UniqueName: \"kubernetes.io/projected/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-kube-api-access-v2wvs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341628 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341647 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bdc44913-44bd-4899-8f7b-d4908bad33c3-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341656 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341666 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341674 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lq4pr\" (UniqueName: \"kubernetes.io/projected/bdc44913-44bd-4899-8f7b-d4908bad33c3-kube-api-access-lq4pr\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.341683 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/bdc44913-44bd-4899-8f7b-d4908bad33c3-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.524151 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.536497 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54dbf7ffd5-z6rf5"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.554026 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:48 crc kubenswrapper[4790]: I0313 20:47:48.560174 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-58656c768f-spczn"] Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.310516 4790 scope.go:117] "RemoveContainer" containerID="f7f9d1b3630ded700d395f27b979ba4f7e495b7e7a9a351c560af831a148e3d8" Mar 13 20:47:49 crc kubenswrapper[4790]: E0313 20:47:49.342091 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 13 20:47:49 crc kubenswrapper[4790]: E0313 20:47:49.342286 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8mdhq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-g2nmn_openstack(32ffb609-7a3b-42b7-b513-7003deefe5dd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 20:47:49 crc kubenswrapper[4790]: E0313 20:47:49.343580 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-g2nmn" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.431267 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.490011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") pod \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.490051 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") pod \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.490118 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") pod \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\" (UID: \"eef97bfb-4275-4a0a-bae4-5442cf7400dd\") " Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.506080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6" (OuterVolumeSpecName: "kube-api-access-llgn6") pod "eef97bfb-4275-4a0a-bae4-5442cf7400dd" (UID: "eef97bfb-4275-4a0a-bae4-5442cf7400dd"). InnerVolumeSpecName "kube-api-access-llgn6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.516040 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "eef97bfb-4275-4a0a-bae4-5442cf7400dd" (UID: "eef97bfb-4275-4a0a-bae4-5442cf7400dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.522270 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config" (OuterVolumeSpecName: "config") pod "eef97bfb-4275-4a0a-bae4-5442cf7400dd" (UID: "eef97bfb-4275-4a0a-bae4-5442cf7400dd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.591572 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.591629 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llgn6\" (UniqueName: \"kubernetes.io/projected/eef97bfb-4275-4a0a-bae4-5442cf7400dd-kube-api-access-llgn6\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.591646 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/eef97bfb-4275-4a0a-bae4-5442cf7400dd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.674074 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" path="/var/lib/kubelet/pods/34d41874-8dfa-4e3d-9298-d027a3e3c921/volumes" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.674930 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6" path="/var/lib/kubelet/pods/4f7f54d0-0f93-497b-b5cb-2a35d7dc68f6/volumes" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.675459 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdc44913-44bd-4899-8f7b-d4908bad33c3" path="/var/lib/kubelet/pods/bdc44913-44bd-4899-8f7b-d4908bad33c3/volumes" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.798553 4790 scope.go:117] "RemoveContainer" containerID="15ece6b8455b1981485fe94641a2ac4a65bad6b4ff6e1fde766f9de31aa3ea24" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.907796 4790 scope.go:117] "RemoveContainer" containerID="3616d5067b96b923fdb7687f09ebcaab7f10f570884d08425f74771298051193" Mar 13 20:47:49 crc kubenswrapper[4790]: I0313 20:47:49.994436 4790 scope.go:117] "RemoveContainer" containerID="267c9dc16cc049015bd4edf304ecb796705f9133394d3f5b1d188823da72e942" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.045757 4790 scope.go:117] "RemoveContainer" containerID="03a87f5d6c3388f53ac8b07b4a8345caa059485eb6f71dad3953ac168c0ce643" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.152562 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.184404 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-mg4xg" event={"ID":"eef97bfb-4275-4a0a-bae4-5442cf7400dd","Type":"ContainerDied","Data":"e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0"} Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.184441 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e447298b31e6ff846ad9bdd5d124a1ff7c9fbf3f3daae17551c3e92c51b821f0" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.184506 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-mg4xg" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.190246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerStarted","Data":"32071f4748bdbdbbb2169f1b2a9fc194d9a40accb2c6784c59874d08e8b9f3b6"} Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.191304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-g2nmn" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.270110 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-686b857b8-6fghv"] Mar 13 20:47:50 crc kubenswrapper[4790]: W0313 20:47:50.272484 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0f5105d_51ea_4e5e_832f_8302188a943a.slice/crio-6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221 WatchSource:0}: Error finding container 6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221: Status 404 returned error can't find the container with id 6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221 Mar 13 20:47:50 crc kubenswrapper[4790]: W0313 20:47:50.354769 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod773bad92_580e_4a9c_9ba5_eef9d8bbc40d.slice/crio-b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd WatchSource:0}: Error finding container b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd: Status 404 returned error can't find the container with id b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.358506 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.373757 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.392155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 13 20:47:50 crc kubenswrapper[4790]: W0313 20:47:50.517054 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ed0eb88_051d_48ad_a934_3cfb7dbcd0f2.slice/crio-ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5 WatchSource:0}: Error finding container ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5: Status 404 returned error can't find the container with id ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5 Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.518368 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.735639 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.736098 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" 
Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736112 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.736125 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="init" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736133 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="init" Mar 13 20:47:50 crc kubenswrapper[4790]: E0313 20:47:50.736150 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerName="neutron-db-sync" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736158 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerName="neutron-db-sync" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736352 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" containerName="neutron-db-sync" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.736371 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.738079 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.757044 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821488 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821520 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.821577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.885699 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.887002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.891677 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-6dm5h" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.891933 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.898298 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.913773 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.923431 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924359 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924430 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.924501 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.925322 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.934878 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.935697 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.936700 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.935549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:50 crc kubenswrapper[4790]: I0313 20:47:50.973325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"dnsmasq-dns-55f844cf75-87qd2\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029093 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") 
pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029226 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029313 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.029355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.130973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131094 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.131680 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.134993 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " 
pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.137092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.137227 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.137662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.148456 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"neutron-5fc7fb5bf6-ctr9l\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.156825 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.190163 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.207111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerStarted","Data":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.214502 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerStarted","Data":"de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.214537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerStarted","Data":"c83d34d6ac5900a556ff1a044d46cc3895eff153b6e266aeaf785a861d60ccbe"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.234677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.243222 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerStarted","Data":"ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.246709 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerStarted","Data":"b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.249090 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686b857b8-6fghv" event={"ID":"d0f5105d-51ea-4e5e-832f-8302188a943a","Type":"ContainerStarted","Data":"f1a3bc5c28291092364d29567ef31da8206a941647d2233217845f100a2524d3"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.249132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686b857b8-6fghv" event={"ID":"d0f5105d-51ea-4e5e-832f-8302188a943a","Type":"ContainerStarted","Data":"6723085da12e953e7a5a4368be76839cdf9f59193d4c956744420d89e3776221"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.274642 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerStarted","Data":"51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.284026 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerStarted","Data":"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483"} Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.314253 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-wbb8v" podStartSLOduration=5.082679712 podStartE2EDuration="30.314233089s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="2026-03-13 20:47:22.874153372 +0000 UTC m=+1173.895269263" 
lastFinishedPulling="2026-03-13 20:47:48.105706749 +0000 UTC m=+1199.126822640" observedRunningTime="2026-03-13 20:47:51.302636563 +0000 UTC m=+1202.323752454" watchObservedRunningTime="2026-03-13 20:47:51.314233089 +0000 UTC m=+1202.335348980" Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.771635 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:47:51 crc kubenswrapper[4790]: I0313 20:47:51.777198 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-74f6bcbc87-v8dxb" podUID="34d41874-8dfa-4e3d-9298-d027a3e3c921" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 13 20:47:52 crc kubenswrapper[4790]: W0313 20:47:52.116470 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96f53d5c_8b27_4810_a760_f7c9a4ee567b.slice/crio-244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0 WatchSource:0}: Error finding container 244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0: Status 404 returned error can't find the container with id 244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.117413 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.293530 4790 generic.go:334] "Generic (PLEG): container finished" podID="6792eda6-a284-42ab-a650-f21b012f7f44" containerID="c85d717e10fb599c6b3d50e3cfc797654ac9faa539262c4f8824bda9117967e3" exitCode=0 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.293601 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerDied","Data":"c85d717e10fb599c6b3d50e3cfc797654ac9faa539262c4f8824bda9117967e3"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.293849 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerStarted","Data":"eb26cba5d4f1f28cf0c444cc204a575c1f7bb95d1f8f9337a19506bba53fe819"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.304510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-686b857b8-6fghv" event={"ID":"d0f5105d-51ea-4e5e-832f-8302188a943a","Type":"ContainerStarted","Data":"29f470a4b7bd1504791f24aa94bc2135d812fc6b51bc685b80f06c023dc9d304"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.309731 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerStarted","Data":"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.323284 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerStarted","Data":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.323526 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76485b6c5-pjfp4" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" 
containerID="cri-o://f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" gracePeriod=30 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.323755 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-76485b6c5-pjfp4" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" containerID="cri-o://45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" gracePeriod=30 Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.339136 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerStarted","Data":"244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.344236 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerStarted","Data":"4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.347477 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-77655f674d-4r7h4" podStartSLOduration=22.34745474 podStartE2EDuration="22.34745474s" podCreationTimestamp="2026-03-13 20:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:52.335576477 +0000 UTC m=+1203.356692368" watchObservedRunningTime="2026-03-13 20:47:52.34745474 +0000 UTC m=+1203.368570631" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.360262 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerStarted","Data":"6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6"} Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.370315 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.370556 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-686b857b8-6fghv" podStartSLOduration=22.370531699 podStartE2EDuration="22.370531699s" podCreationTimestamp="2026-03-13 20:47:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:52.354149383 +0000 UTC m=+1203.375265284" watchObservedRunningTime="2026-03-13 20:47:52.370531699 +0000 UTC m=+1203.391647590" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.397009 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-76485b6c5-pjfp4" podStartSLOduration=6.364096975 podStartE2EDuration="31.393352793s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="2026-03-13 20:47:23.076880732 +0000 UTC m=+1174.097996623" lastFinishedPulling="2026-03-13 20:47:48.10613655 +0000 UTC m=+1199.127252441" observedRunningTime="2026-03-13 20:47:52.381862038 +0000 UTC m=+1203.402977929" watchObservedRunningTime="2026-03-13 20:47:52.393352793 +0000 UTC m=+1203.414468684" Mar 13 20:47:52 crc kubenswrapper[4790]: I0313 20:47:52.420715 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m4zxn" podStartSLOduration=11.420692338 
podStartE2EDuration="11.420692338s" podCreationTimestamp="2026-03-13 20:47:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:52.404003812 +0000 UTC m=+1203.425119703" watchObservedRunningTime="2026-03-13 20:47:52.420692338 +0000 UTC m=+1203.441808229" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.166539 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.168635 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.173743 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.174094 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.180878 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291330 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291392 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291417 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291444 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291507 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " 
pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.291533 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.373443 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerStarted","Data":"8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.373753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerStarted","Data":"449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.374048 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.386906 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerStarted","Data":"96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.389826 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerStarted","Data":"5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.392793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerStarted","Data":"de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8"} Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393953 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9968b\" (UniqueName: 
\"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.393989 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.394015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.394041 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.409233 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5fc7fb5bf6-ctr9l" podStartSLOduration=3.409212899 podStartE2EDuration="3.409212899s" podCreationTimestamp="2026-03-13 20:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.395842014 +0000 UTC m=+1204.416957915" watchObservedRunningTime="2026-03-13 20:47:53.409212899 +0000 UTC m=+1204.430328790" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.409293 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.410917 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.411687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.418315 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.431084 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.441243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.455074 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"neutron-655d56d4d9-rckws\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.487980 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=15.487955657 podStartE2EDuration="15.487955657s" podCreationTimestamp="2026-03-13 20:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.44148383 +0000 UTC m=+1204.462599741" watchObservedRunningTime="2026-03-13 20:47:53.487955657 +0000 UTC m=+1204.509071548" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.489203 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" podStartSLOduration=3.489194901 podStartE2EDuration="3.489194901s" podCreationTimestamp="2026-03-13 20:47:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.484870533 +0000 UTC m=+1204.505986424" watchObservedRunningTime="2026-03-13 20:47:53.489194901 +0000 UTC m=+1204.510310802" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.531019 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=16.530998251 podStartE2EDuration="16.530998251s" podCreationTimestamp="2026-03-13 20:47:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:47:53.522553961 +0000 UTC m=+1204.543669852" watchObservedRunningTime="2026-03-13 20:47:53.530998251 +0000 UTC m=+1204.552114142" Mar 13 20:47:53 crc kubenswrapper[4790]: I0313 20:47:53.543747 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.192789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:47:54 crc kubenswrapper[4790]: W0313 20:47:54.205990 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podebc86632_179c_403a_bbdd_d496a21c018c.slice/crio-84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc WatchSource:0}: Error finding container 84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc: Status 404 returned error can't find the container with id 84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.408869 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerStarted","Data":"84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc"} Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.411858 4790 generic.go:334] "Generic (PLEG): container finished" podID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerID="51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196" exitCode=0 Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.412840 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerDied","Data":"51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196"} Mar 13 20:47:54 crc kubenswrapper[4790]: I0313 20:47:54.412882 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.156350 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288771 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288815 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288909 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.288965 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") pod \"e8b8bbca-4be9-43d3-b692-0587892a50b4\" (UID: \"e8b8bbca-4be9-43d3-b692-0587892a50b4\") " Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.289778 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs" (OuterVolumeSpecName: "logs") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.295830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8" (OuterVolumeSpecName: "kube-api-access-6s8z8") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "kube-api-access-6s8z8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.296491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts" (OuterVolumeSpecName: "scripts") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.316294 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.318580 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data" (OuterVolumeSpecName: "config-data") pod "e8b8bbca-4be9-43d3-b692-0587892a50b4" (UID: "e8b8bbca-4be9-43d3-b692-0587892a50b4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391548 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391594 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391608 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8b8bbca-4be9-43d3-b692-0587892a50b4-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391620 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8b8bbca-4be9-43d3-b692-0587892a50b4-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.391632 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6s8z8\" (UniqueName: \"kubernetes.io/projected/e8b8bbca-4be9-43d3-b692-0587892a50b4-kube-api-access-6s8z8\") on node \"crc\" DevicePath \"\"" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.444517 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerStarted","Data":"9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0"} Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.446921 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-wbb8v" event={"ID":"e8b8bbca-4be9-43d3-b692-0587892a50b4","Type":"ContainerDied","Data":"a21e9d9d91f185d12ea208152b90be684a6632b64a67e174026d61018c3b2d9d"} Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.446965 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a21e9d9d91f185d12ea208152b90be684a6632b64a67e174026d61018c3b2d9d" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.447014 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-wbb8v" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.739636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.739694 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.772741 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:57 crc kubenswrapper[4790]: I0313 20:47:57.782571 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.255197 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:47:58 crc kubenswrapper[4790]: E0313 20:47:58.255764 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerName="placement-db-sync" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.255788 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerName="placement-db-sync" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.256026 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" containerName="placement-db-sync" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.257298 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.259759 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.259895 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-2nkvl" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.260164 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.261091 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.261508 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.278051 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311189 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311301 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " 
pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311358 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311416 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311460 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.311483 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412606 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412681 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412763 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412829 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " 
pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.412901 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.413587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.418937 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.423676 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.423905 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.427316 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.429610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.429813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"placement-6cd9b448d6-w8fcr\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.433137 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.434015 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.467888 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerID="de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb" exitCode=0 Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.469132 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerDied","Data":"de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb"} Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.469175 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.470249 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.470394 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.544430 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:47:58 crc kubenswrapper[4790]: I0313 20:47:58.581494 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:47:59 crc kubenswrapper[4790]: I0313 20:47:59.479831 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:47:59 crc kubenswrapper[4790]: I0313 20:47:59.479955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.156713 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.163312 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.168899 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.168899 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.173272 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.179108 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.260550 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"auto-csr-approver-29557248-msw96\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.365694 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"auto-csr-approver-29557248-msw96\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.412089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"auto-csr-approver-29557248-msw96\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.428488 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.428527 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.488688 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.492270 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.492295 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.497485 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.498477 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:00 crc kubenswrapper[4790]: I0313 20:48:00.925136 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.159648 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.228076 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.228528 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" containerID="cri-o://a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77" gracePeriod=10 Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.505972 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerID="a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77" exitCode=0 Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.506067 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.506052 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerDied","Data":"a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77"} Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.572963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.892572 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:48:01 crc kubenswrapper[4790]: I0313 20:48:01.892680 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:48:02 crc kubenswrapper[4790]: I0313 20:48:02.093544 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: connect: connection refused" Mar 13 20:48:02 crc kubenswrapper[4790]: I0313 20:48:02.305312 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.363667 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450512 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450572 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450610 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450659 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450742 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.450840 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") pod \"dd2c3694-0492-400f-98bd-b3c641edfac0\" (UID: \"dd2c3694-0492-400f-98bd-b3c641edfac0\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.458002 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f" (OuterVolumeSpecName: "kube-api-access-ghp5f") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "kube-api-access-ghp5f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.458729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.459356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts" (OuterVolumeSpecName: "scripts") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.460456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.484427 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.484456 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data" (OuterVolumeSpecName: "config-data") pod "dd2c3694-0492-400f-98bd-b3c641edfac0" (UID: "dd2c3694-0492-400f-98bd-b3c641edfac0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.542686 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m4zxn" event={"ID":"dd2c3694-0492-400f-98bd-b3c641edfac0","Type":"ContainerDied","Data":"c83d34d6ac5900a556ff1a044d46cc3895eff153b6e266aeaf785a861d60ccbe"} Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.542722 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c83d34d6ac5900a556ff1a044d46cc3895eff153b6e266aeaf785a861d60ccbe" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.542795 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m4zxn" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554816 4790 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554856 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554868 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554881 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554894 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghp5f\" (UniqueName: \"kubernetes.io/projected/dd2c3694-0492-400f-98bd-b3c641edfac0-kube-api-access-ghp5f\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.554908 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd2c3694-0492-400f-98bd-b3c641edfac0-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.752341 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858588 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858671 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.858785 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.859402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" 
(UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.859472 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") pod \"7dc42e6e-503c-4931-87e1-adcbf3469570\" (UID: \"7dc42e6e-503c-4931-87e1-adcbf3469570\") " Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.864804 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977" (OuterVolumeSpecName: "kube-api-access-t7977") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "kube-api-access-t7977". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.902806 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.984712 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.987394 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7977\" (UniqueName: \"kubernetes.io/projected/7dc42e6e-503c-4931-87e1-adcbf3469570-kube-api-access-t7977\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:03 crc kubenswrapper[4790]: I0313 20:48:03.995173 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.005968 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.016939 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.019351 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.041005 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config" (OuterVolumeSpecName: "config") pod "7dc42e6e-503c-4931-87e1-adcbf3469570" (UID: "7dc42e6e-503c-4931-87e1-adcbf3469570"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091430 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091457 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091468 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091477 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.091485 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7dc42e6e-503c-4931-87e1-adcbf3469570-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479355 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c5788df58-llnz4"] Mar 13 20:48:04 crc kubenswrapper[4790]: E0313 20:48:04.479934 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerName="keystone-bootstrap" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479945 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerName="keystone-bootstrap" Mar 13 20:48:04 crc kubenswrapper[4790]: E0313 20:48:04.479965 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479971 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" Mar 13 20:48:04 crc kubenswrapper[4790]: E0313 20:48:04.479983 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="init" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.479990 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="init" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.480159 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" containerName="keystone-bootstrap" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.480187 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" containerName="dnsmasq-dns" Mar 13 20:48:04 crc 
kubenswrapper[4790]: I0313 20:48:04.489158 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.506420 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.506668 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.507740 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.507813 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.508090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-jntkf" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.508296 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.547931 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5788df58-llnz4"] Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.577168 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerStarted","Data":"062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.596719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.600090 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerStarted","Data":"f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.600166 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerStarted","Data":"5d216af4785a04f3e8536b6945d51a46024ad4cfced21083156e56a883fa3cab"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.629135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-credential-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.629502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-internal-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.629673 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-config-data\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-combined-ca-bundle\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630532 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-fernet-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-scripts\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630759 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpmcw\" (UniqueName: \"kubernetes.io/projected/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-kube-api-access-vpmcw\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.630893 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-public-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.635398 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-kkmzk" podStartSLOduration=2.039330358 podStartE2EDuration="43.635366208s" podCreationTimestamp="2026-03-13 20:47:21 +0000 UTC" firstStartedPulling="2026-03-13 20:47:22.546774133 +0000 UTC m=+1173.567890024" lastFinishedPulling="2026-03-13 20:48:04.142809993 +0000 UTC m=+1215.163925874" observedRunningTime="2026-03-13 20:48:04.628090299 +0000 UTC m=+1215.649206190" watchObservedRunningTime="2026-03-13 20:48:04.635366208 +0000 UTC m=+1215.656482099" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.638565 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerStarted","Data":"f77f483d75213eae4864a3d19aa92203b67c406a1011bada9c9ab22419c8844d"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.664813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerStarted","Data":"abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.667462 4790 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.691126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" event={"ID":"7dc42e6e-503c-4931-87e1-adcbf3469570","Type":"ContainerDied","Data":"6f999ef5f392142853bfd734e8f05087e30b1884df0a3fcb1f826bf8ee332e9d"} Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.691169 4790 scope.go:117] "RemoveContainer" containerID="a8c73c53b8b1c75ce690b997abf213fa210d425aaf795980f205e05d12075a77" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.691304 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-785d8bcb8c-n8ckq" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.707752 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-655d56d4d9-rckws" podStartSLOduration=11.707733171 podStartE2EDuration="11.707733171s" podCreationTimestamp="2026-03-13 20:47:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:04.70034653 +0000 UTC m=+1215.721462411" watchObservedRunningTime="2026-03-13 20:48:04.707733171 +0000 UTC m=+1215.728849062" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.719278 4790 scope.go:117] "RemoveContainer" containerID="365da7d7e0570f42f94165ed1add103834755db474d98891c1b86296bdc4478f" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.732173 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-credential-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.732212 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-internal-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733014 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-config-data\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733075 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-combined-ca-bundle\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-fernet-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-scripts\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733159 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpmcw\" (UniqueName: \"kubernetes.io/projected/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-kube-api-access-vpmcw\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.733174 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-public-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.739813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-combined-ca-bundle\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.747651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-public-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.751827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-config-data\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.753827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-credential-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.755897 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-scripts\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.756557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-fernet-keys\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.759704 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-internal-tls-certs\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " 
pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.761535 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpmcw\" (UniqueName: \"kubernetes.io/projected/4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7-kube-api-access-vpmcw\") pod \"keystone-c5788df58-llnz4\" (UID: \"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7\") " pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.781694 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.796884 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-785d8bcb8c-n8ckq"] Mar 13 20:48:04 crc kubenswrapper[4790]: I0313 20:48:04.854635 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.366064 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c5788df58-llnz4"] Mar 13 20:48:05 crc kubenswrapper[4790]: W0313 20:48:05.371529 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c3cfa50_a4b5_45e0_9cb4_d6a5495f4fb7.slice/crio-76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6 WatchSource:0}: Error finding container 76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6: Status 404 returned error can't find the container with id 76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6 Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.670094 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dc42e6e-503c-4931-87e1-adcbf3469570" path="/var/lib/kubelet/pods/7dc42e6e-503c-4931-87e1-adcbf3469570/volumes" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.704418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5788df58-llnz4" event={"ID":"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7","Type":"ContainerStarted","Data":"35f97c0eddce4a6fe454ce018026e47dd5647b67c41351be553885eff30838d2"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.704454 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c5788df58-llnz4" event={"ID":"4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7","Type":"ContainerStarted","Data":"76c9bfcbd45fc9644160ada88176fbf5325e0f84118a60592e674b2f60e715d6"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.705353 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.707649 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerStarted","Data":"963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.707797 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.721238 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.727319 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" 
event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerStarted","Data":"3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.730519 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerStarted","Data":"f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9"} Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.750410 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c5788df58-llnz4" podStartSLOduration=1.750366729 podStartE2EDuration="1.750366729s" podCreationTimestamp="2026-03-13 20:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:05.741770514 +0000 UTC m=+1216.762886415" watchObservedRunningTime="2026-03-13 20:48:05.750366729 +0000 UTC m=+1216.771482620" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.761507 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557248-msw96" podStartSLOduration=4.692809664 podStartE2EDuration="5.761486761s" podCreationTimestamp="2026-03-13 20:48:00 +0000 UTC" firstStartedPulling="2026-03-13 20:48:03.982866581 +0000 UTC m=+1215.003982472" lastFinishedPulling="2026-03-13 20:48:05.051543678 +0000 UTC m=+1216.072659569" observedRunningTime="2026-03-13 20:48:05.760449893 +0000 UTC m=+1216.781565784" watchObservedRunningTime="2026-03-13 20:48:05.761486761 +0000 UTC m=+1216.782602652" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.779357 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6cd9b448d6-w8fcr" podStartSLOduration=7.779343879 podStartE2EDuration="7.779343879s" podCreationTimestamp="2026-03-13 20:47:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:05.775779972 +0000 UTC m=+1216.796895863" watchObservedRunningTime="2026-03-13 20:48:05.779343879 +0000 UTC m=+1216.800459770" Mar 13 20:48:05 crc kubenswrapper[4790]: I0313 20:48:05.810027 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-g2nmn" podStartSLOduration=4.165256906 podStartE2EDuration="45.810005105s" podCreationTimestamp="2026-03-13 20:47:20 +0000 UTC" firstStartedPulling="2026-03-13 20:47:22.341598897 +0000 UTC m=+1173.362714788" lastFinishedPulling="2026-03-13 20:48:03.986347096 +0000 UTC m=+1215.007462987" observedRunningTime="2026-03-13 20:48:05.79879838 +0000 UTC m=+1216.819914281" watchObservedRunningTime="2026-03-13 20:48:05.810005105 +0000 UTC m=+1216.831120996" Mar 13 20:48:06 crc kubenswrapper[4790]: I0313 20:48:06.741637 4790 generic.go:334] "Generic (PLEG): container finished" podID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerID="3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a" exitCode=0 Mar 13 20:48:06 crc kubenswrapper[4790]: I0313 20:48:06.741785 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerDied","Data":"3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a"} Mar 13 20:48:07 crc kubenswrapper[4790]: I0313 20:48:07.761237 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerID="062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a" exitCode=0 Mar 13 20:48:07 crc kubenswrapper[4790]: I0313 20:48:07.762231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerDied","Data":"062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a"} Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.158651 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.303012 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") pod \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\" (UID: \"eda4da8c-f54a-4c25-9669-ff180aa0b9a9\") " Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.324357 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9" (OuterVolumeSpecName: "kube-api-access-tc4x9") pod "eda4da8c-f54a-4c25-9669-ff180aa0b9a9" (UID: "eda4da8c-f54a-4c25-9669-ff180aa0b9a9"). InnerVolumeSpecName "kube-api-access-tc4x9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.405841 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tc4x9\" (UniqueName: \"kubernetes.io/projected/eda4da8c-f54a-4c25-9669-ff180aa0b9a9-kube-api-access-tc4x9\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.776332 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557248-msw96" event={"ID":"eda4da8c-f54a-4c25-9669-ff180aa0b9a9","Type":"ContainerDied","Data":"f77f483d75213eae4864a3d19aa92203b67c406a1011bada9c9ab22419c8844d"} Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.776366 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557248-msw96" Mar 13 20:48:08 crc kubenswrapper[4790]: I0313 20:48:08.776401 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f77f483d75213eae4864a3d19aa92203b67c406a1011bada9c9ab22419c8844d" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.359231 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.370761 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557242-lp8qf"] Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.433578 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.528093 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") pod \"5dff6930-5d07-4df7-8d42-470ae83afd38\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.528224 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") pod \"5dff6930-5d07-4df7-8d42-470ae83afd38\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.528252 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") pod \"5dff6930-5d07-4df7-8d42-470ae83afd38\" (UID: \"5dff6930-5d07-4df7-8d42-470ae83afd38\") " Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.546678 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf" (OuterVolumeSpecName: "kube-api-access-tw2bf") pod "5dff6930-5d07-4df7-8d42-470ae83afd38" (UID: "5dff6930-5d07-4df7-8d42-470ae83afd38"). InnerVolumeSpecName "kube-api-access-tw2bf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.546780 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "5dff6930-5d07-4df7-8d42-470ae83afd38" (UID: "5dff6930-5d07-4df7-8d42-470ae83afd38"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.580506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5dff6930-5d07-4df7-8d42-470ae83afd38" (UID: "5dff6930-5d07-4df7-8d42-470ae83afd38"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.630759 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tw2bf\" (UniqueName: \"kubernetes.io/projected/5dff6930-5d07-4df7-8d42-470ae83afd38-kube-api-access-tw2bf\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.630802 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.630816 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/5dff6930-5d07-4df7-8d42-470ae83afd38-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.675582 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6027d153-5f8e-4bb1-8275-9a8df8c533f2" path="/var/lib/kubelet/pods/6027d153-5f8e-4bb1-8275-9a8df8c533f2/volumes" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.790163 4790 generic.go:334] "Generic (PLEG): container finished" podID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerID="f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9" exitCode=0 Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.791139 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerDied","Data":"f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9"} Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.802269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-kkmzk" event={"ID":"5dff6930-5d07-4df7-8d42-470ae83afd38","Type":"ContainerDied","Data":"66bbb0c4358595b69723c41e22c295dc43c704b8a97a66ffe918a90a7b96cb73"} Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.802355 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66bbb0c4358595b69723c41e22c295dc43c704b8a97a66ffe918a90a7b96cb73" Mar 13 20:48:09 crc kubenswrapper[4790]: I0313 20:48:09.802531 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-kkmzk" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.230267 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-798f469b5d-gs7bt"] Mar 13 20:48:10 crc kubenswrapper[4790]: E0313 20:48:10.231091 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerName="oc" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231169 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerName="oc" Mar 13 20:48:10 crc kubenswrapper[4790]: E0313 20:48:10.231252 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerName="barbican-db-sync" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231312 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerName="barbican-db-sync" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231557 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" containerName="barbican-db-sync" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.231641 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" containerName="oc" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.232658 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.239824 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-2zqc7" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.240226 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.240403 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.245656 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d9ddc9bbc-tg88r"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.247055 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.257898 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.273177 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-798f469b5d-gs7bt"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.324789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d9ddc9bbc-tg88r"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-combined-ca-bundle\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345941 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f92730-30b3-4583-ab7c-258c0a0880a2-logs\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f859x\" (UniqueName: \"kubernetes.io/projected/8a191811-ef81-4066-bcbb-0385c9258fc0-kube-api-access-f859x\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.345979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-combined-ca-bundle\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346030 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data-custom\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346050 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr6b\" (UniqueName: \"kubernetes.io/projected/98f92730-30b3-4583-ab7c-258c0a0880a2-kube-api-access-vpr6b\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a191811-ef81-4066-bcbb-0385c9258fc0-logs\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " 
pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346154 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.346173 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data-custom\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.422349 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.423844 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.435552 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.447778 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448082 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data-custom\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-combined-ca-bundle\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448459 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f92730-30b3-4583-ab7c-258c0a0880a2-logs\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: 
\"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f859x\" (UniqueName: \"kubernetes.io/projected/8a191811-ef81-4066-bcbb-0385c9258fc0-kube-api-access-f859x\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448618 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-combined-ca-bundle\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data-custom\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448796 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr6b\" (UniqueName: \"kubernetes.io/projected/98f92730-30b3-4583-ab7c-258c0a0880a2-kube-api-access-vpr6b\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a191811-ef81-4066-bcbb-0385c9258fc0-logs\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.448985 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.450601 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/98f92730-30b3-4583-ab7c-258c0a0880a2-logs\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.451269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8a191811-ef81-4066-bcbb-0385c9258fc0-logs\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.457948 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 
20:48:10.462617 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-combined-ca-bundle\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.464632 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.466388 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a191811-ef81-4066-bcbb-0385c9258fc0-config-data-custom\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.470513 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-combined-ca-bundle\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.478845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data-custom\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.483062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f859x\" (UniqueName: \"kubernetes.io/projected/8a191811-ef81-4066-bcbb-0385c9258fc0-kube-api-access-f859x\") pod \"barbican-keystone-listener-798f469b5d-gs7bt\" (UID: \"8a191811-ef81-4066-bcbb-0385c9258fc0\") " pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.493250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr6b\" (UniqueName: \"kubernetes.io/projected/98f92730-30b3-4583-ab7c-258c0a0880a2-kube-api-access-vpr6b\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.500170 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/98f92730-30b3-4583-ab7c-258c0a0880a2-config-data\") pod \"barbican-worker-5d9ddc9bbc-tg88r\" (UID: \"98f92730-30b3-4583-ab7c-258c0a0880a2\") " pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.513996 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-686b857b8-6fghv" podUID="d0f5105d-51ea-4e5e-832f-8302188a943a" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: 
connection refused" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551002 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551082 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551179 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.551195 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.555079 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.556678 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.561472 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.587798 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.603100 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.637526 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655111 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655364 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655478 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655571 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655598 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655797 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655825 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod 
\"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655860 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.655909 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.656548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.656970 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.657322 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.657723 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.657923 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.673819 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"dnsmasq-dns-85ff748b95-db5jn\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 
13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757758 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757786 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.757822 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.764184 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.764242 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.766349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.766807 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.779207 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.796572 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod 
\"barbican-api-5c748666b-tvhxb\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:10 crc kubenswrapper[4790]: I0313 20:48:10.880984 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.021050 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7c6887dbdb-wnl4x"] Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.022707 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.028691 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.028903 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.036200 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6887dbdb-wnl4x"] Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.196730 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-public-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197122 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197408 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-logs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197568 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-internal-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197643 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-combined-ca-bundle\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197697 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data-custom\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: 
\"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.197721 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbp2\" (UniqueName: \"kubernetes.io/projected/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-kube-api-access-snbp2\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.298926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-public-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-logs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299227 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-internal-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-combined-ca-bundle\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299301 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data-custom\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.299316 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbp2\" (UniqueName: \"kubernetes.io/projected/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-kube-api-access-snbp2\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.305730 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-logs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: 
\"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.309911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data-custom\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.315539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-config-data\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.315980 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-public-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.320532 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-combined-ca-bundle\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.323986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-internal-tls-certs\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.326453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbp2\" (UniqueName: \"kubernetes.io/projected/dc5e5f2f-999a-4ae6-82f1-d5942a570a3e-kube-api-access-snbp2\") pod \"barbican-api-7c6887dbdb-wnl4x\" (UID: \"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e\") " pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:13 crc kubenswrapper[4790]: I0313 20:48:13.363698 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.015538 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.015823 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.015863 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.016325 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.016390 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c" gracePeriod=600 Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.878963 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c" exitCode=0 Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.879023 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c"} Mar 13 20:48:14 crc kubenswrapper[4790]: I0313 20:48:14.879289 4790 scope.go:117] "RemoveContainer" containerID="1c2f579c051539fdc9bad07dcbfb84169db8dd999445ba48e52c550831462bdf" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.004342 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132309 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132331 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132459 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132483 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132550 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") pod \"32ffb609-7a3b-42b7-b513-7003deefe5dd\" (UID: \"32ffb609-7a3b-42b7-b513-7003deefe5dd\") " Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.132768 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.141484 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.141573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq" (OuterVolumeSpecName: "kube-api-access-8mdhq") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "kube-api-access-8mdhq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.144493 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts" (OuterVolumeSpecName: "scripts") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.186879 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data" (OuterVolumeSpecName: "config-data") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.189159 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32ffb609-7a3b-42b7-b513-7003deefe5dd" (UID: "32ffb609-7a3b-42b7-b513-7003deefe5dd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234875 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234911 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234921 4790 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234930 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/32ffb609-7a3b-42b7-b513-7003deefe5dd-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234940 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32ffb609-7a3b-42b7-b513-7003deefe5dd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.234949 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8mdhq\" (UniqueName: \"kubernetes.io/projected/32ffb609-7a3b-42b7-b513-7003deefe5dd-kube-api-access-8mdhq\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:15 crc kubenswrapper[4790]: E0313 20:48:15.843755 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.890071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-g2nmn" 
event={"ID":"32ffb609-7a3b-42b7-b513-7003deefe5dd","Type":"ContainerDied","Data":"593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd"} Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.890360 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="593ecf2c1e6edaf48caa97f46955c4d04cc6ddaa0effd6b586de30210f0a0ecd" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.890521 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-g2nmn" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893304 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerStarted","Data":"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4"} Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893453 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" containerID="cri-o://0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" gracePeriod=30 Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893652 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893919 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" containerID="cri-o://8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" gracePeriod=30 Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.893960 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" containerID="cri-o://fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" gracePeriod=30 Mar 13 20:48:15 crc kubenswrapper[4790]: I0313 20:48:15.895766 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb"} Mar 13 20:48:16 crc kubenswrapper[4790]: W0313 20:48:16.100560 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc5e5f2f_999a_4ae6_82f1_d5942a570a3e.slice/crio-ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43 WatchSource:0}: Error finding container ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43: Status 404 returned error can't find the container with id ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.100646 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7c6887dbdb-wnl4x"] Mar 13 20:48:16 crc kubenswrapper[4790]: W0313 20:48:16.103150 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod07188cb9_d2b0_4923_a90c_386eb3525476.slice/crio-4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9 WatchSource:0}: Error finding container 4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9: Status 404 returned error can't find the 
container with id 4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.112565 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.294654 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: E0313 20:48:16.295349 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerName="cinder-db-sync" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.295367 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerName="cinder-db-sync" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.295567 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" containerName="cinder-db-sync" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.296724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301311 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-9qb6s" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301531 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301558 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.301477 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.319143 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.350946 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.367121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368766 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.368951 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.369041 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.407467 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.409258 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.424228 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472538 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472654 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472802 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472820 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472852 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.472894 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.473000 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.487759 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.489017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.493357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.494472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.517480 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"cinder-scheduler-0\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.524782 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.547434 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-798f469b5d-gs7bt"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.556858 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d9ddc9bbc-tg88r"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574256 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574299 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574386 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574453 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.574474 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.575462 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.575531 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.576007 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.576155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.576992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.600516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"dnsmasq-dns-5c9776ccc5-5hxds\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.602151 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.604563 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.608772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.626856 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.661596 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.675979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676356 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676569 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676618 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676695 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.676751 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.775934 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780404 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780476 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780495 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780520 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780568 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780594 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.780988 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.784698 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.784836 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " 
pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.785979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.788472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.794668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.808218 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"cinder-api-0\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " pod="openstack/cinder-api-0" Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.927336 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" event={"ID":"98f92730-30b3-4583-ab7c-258c0a0880a2","Type":"ContainerStarted","Data":"f42b1074404aaa07048c55aad06d728bced62837ddf9c7379f463379b1627e10"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.937671 4790 generic.go:334] "Generic (PLEG): container finished" podID="07188cb9-d2b0-4923-a90c-386eb3525476" containerID="0a4d05e5a2a7ca428f12a479e9146ff2c5a49d668908d72ddf47d5d430d6d357" exitCode=0 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.937789 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" event={"ID":"07188cb9-d2b0-4923-a90c-386eb3525476","Type":"ContainerDied","Data":"0a4d05e5a2a7ca428f12a479e9146ff2c5a49d668908d72ddf47d5d430d6d357"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.937822 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" event={"ID":"07188cb9-d2b0-4923-a90c-386eb3525476","Type":"ContainerStarted","Data":"4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.941776 4790 generic.go:334] "Generic (PLEG): container finished" podID="1abdfade-817b-4659-b8be-48bb516fb866" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" exitCode=2 Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.941862 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.943966 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6887dbdb-wnl4x" event={"ID":"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e","Type":"ContainerStarted","Data":"b2b8ea66f4e218bdde854ae7f7245f16743983cc1e396a879d914f731601eb3d"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 
20:48:16.943991 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6887dbdb-wnl4x" event={"ID":"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e","Type":"ContainerStarted","Data":"ca862add655685f15c776d4d78786cd31901b29a833e216630b00925c1967d43"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.945635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerStarted","Data":"8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.945661 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerStarted","Data":"d42a77ccb8df8e4551bc2d02ca8dcf98b96ca45a3c404d603bdd2962aa71b56a"} Mar 13 20:48:16 crc kubenswrapper[4790]: I0313 20:48:16.947662 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" event={"ID":"8a191811-ef81-4066-bcbb-0385c9258fc0","Type":"ContainerStarted","Data":"9fe1c8529fe60428074588ddddc06140144982d7436f19bdadea3c344503eb7a"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.053280 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.247590 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:17 crc kubenswrapper[4790]: W0313 20:48:17.263997 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dd8df37_b60e_4ef1_9b53_6a59ba59e538.slice/crio-356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030 WatchSource:0}: Error finding container 356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030: Status 404 returned error can't find the container with id 356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030 Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.269592 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.359026 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.460711 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527179 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527368 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.527821 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") pod \"07188cb9-d2b0-4923-a90c-386eb3525476\" (UID: \"07188cb9-d2b0-4923-a90c-386eb3525476\") " Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.541699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv" (OuterVolumeSpecName: "kube-api-access-lcrnv") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "kube-api-access-lcrnv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.560535 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config" (OuterVolumeSpecName: "config") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.563826 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.564404 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.569693 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.580584 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "07188cb9-d2b0-4923-a90c-386eb3525476" (UID: "07188cb9-d2b0-4923-a90c-386eb3525476"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634430 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634491 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634505 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634518 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634531 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/07188cb9-d2b0-4923-a90c-386eb3525476-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.634543 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lcrnv\" (UniqueName: \"kubernetes.io/projected/07188cb9-d2b0-4923-a90c-386eb3525476-kube-api-access-lcrnv\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.760310 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.957646 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerStarted","Data":"8575cdd5bb4666a6bc6bc6c42910b6563764f091451eed83947cc6f64da3a0eb"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.963628 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerStarted","Data":"356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.974188 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.974292 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-85ff748b95-db5jn" event={"ID":"07188cb9-d2b0-4923-a90c-386eb3525476","Type":"ContainerDied","Data":"4764ae844905c80f49d49c8334596e36411076b75d76e202fe248b7d8e878fc9"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.974362 4790 scope.go:117] "RemoveContainer" containerID="0a4d05e5a2a7ca428f12a479e9146ff2c5a49d668908d72ddf47d5d430d6d357" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.987742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerStarted","Data":"90cf908fda5bfa83deaae1fd0eac95ba601f9eb9da62b0fab2c3af0677ac98b2"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.991630 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7c6887dbdb-wnl4x" event={"ID":"dc5e5f2f-999a-4ae6-82f1-d5942a570a3e","Type":"ContainerStarted","Data":"f1c4acbc184c6d684bf1f645d51e17a734a58a06beda3e715c2df4b5b76fdda8"} Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.991730 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.991757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:17 crc kubenswrapper[4790]: I0313 20:48:17.999612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerStarted","Data":"17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360"} Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.000210 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.000268 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.030680 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.038791 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-85ff748b95-db5jn"] Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.064820 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5c748666b-tvhxb" podStartSLOduration=8.064771729 podStartE2EDuration="8.064771729s" podCreationTimestamp="2026-03-13 20:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:18.044032853 +0000 UTC m=+1229.065148744" watchObservedRunningTime="2026-03-13 20:48:18.064771729 +0000 UTC m=+1229.085887640" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.089785 4790 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack/barbican-api-7c6887dbdb-wnl4x" podStartSLOduration=6.08975893 podStartE2EDuration="6.08975893s" podCreationTimestamp="2026-03-13 20:48:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:18.077913327 +0000 UTC m=+1229.099029218" watchObservedRunningTime="2026-03-13 20:48:18.08975893 +0000 UTC m=+1229.110874831" Mar 13 20:48:18 crc kubenswrapper[4790]: I0313 20:48:18.335298 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.017343 4790 generic.go:334] "Generic (PLEG): container finished" podID="1abdfade-817b-4659-b8be-48bb516fb866" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" exitCode=0 Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.017512 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c"} Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.026459 4790 generic.go:334] "Generic (PLEG): container finished" podID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerID="54996574df6debfb6f3430b43b232f15654c266d463f051ee19ed34e62244f6c" exitCode=0 Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.026570 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerDied","Data":"54996574df6debfb6f3430b43b232f15654c266d463f051ee19ed34e62244f6c"} Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.030627 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerStarted","Data":"76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca"} Mar 13 20:48:19 crc kubenswrapper[4790]: I0313 20:48:19.669346 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" path="/var/lib/kubelet/pods/07188cb9-d2b0-4923-a90c-386eb3525476/volumes" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.060830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" event={"ID":"98f92730-30b3-4583-ab7c-258c0a0880a2","Type":"ContainerStarted","Data":"4c236a9371cc59f1b41e2ac193fdd9dcd75b79349c6c12f2a9599d496f917fcc"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.061112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" event={"ID":"98f92730-30b3-4583-ab7c-258c0a0880a2","Type":"ContainerStarted","Data":"e13d8c800ad8f3982a06147ccc8adf9e99e1133e8118d3665ddf15123e344c2f"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.073087 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerStarted","Data":"b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.079436 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerStarted","Data":"716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb"} Mar 13 20:48:20 crc 
kubenswrapper[4790]: I0313 20:48:20.080712 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.082582 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d9ddc9bbc-tg88r" podStartSLOduration=7.636768565 podStartE2EDuration="10.082556633s" podCreationTimestamp="2026-03-13 20:48:10 +0000 UTC" firstStartedPulling="2026-03-13 20:48:16.563838932 +0000 UTC m=+1227.584954823" lastFinishedPulling="2026-03-13 20:48:19.009627 +0000 UTC m=+1230.030742891" observedRunningTime="2026-03-13 20:48:20.078363469 +0000 UTC m=+1231.099479360" watchObservedRunningTime="2026-03-13 20:48:20.082556633 +0000 UTC m=+1231.103672524" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.089683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" event={"ID":"8a191811-ef81-4066-bcbb-0385c9258fc0","Type":"ContainerStarted","Data":"7191bb7876683751358d6549cd6e52fa76a26e1517da6569114a61bf3468e91f"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.089742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" event={"ID":"8a191811-ef81-4066-bcbb-0385c9258fc0","Type":"ContainerStarted","Data":"f14c639564d76bb3597d69ed741ad05df72a17b061329b6b4b5e69fea9429dd4"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerStarted","Data":"0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7"} Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102235 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" containerID="cri-o://76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca" gracePeriod=30 Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102301 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" containerID="cri-o://0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7" gracePeriod=30 Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.102357 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.113456 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" podStartSLOduration=4.113433866 podStartE2EDuration="4.113433866s" podCreationTimestamp="2026-03-13 20:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:20.111993886 +0000 UTC m=+1231.133109787" watchObservedRunningTime="2026-03-13 20:48:20.113433866 +0000 UTC m=+1231.134549777" Mar 13 20:48:20 crc kubenswrapper[4790]: I0313 20:48:20.140190 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.140172084 podStartE2EDuration="4.140172084s" podCreationTimestamp="2026-03-13 20:48:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-13 20:48:20.136109874 +0000 UTC m=+1231.157225775" watchObservedRunningTime="2026-03-13 20:48:20.140172084 +0000 UTC m=+1231.161287975" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.114996 4790 generic.go:334] "Generic (PLEG): container finished" podID="de946747-1160-46da-bacd-7ac005e29c73" containerID="0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7" exitCode=0 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.115257 4790 generic.go:334] "Generic (PLEG): container finished" podID="de946747-1160-46da-bacd-7ac005e29c73" containerID="76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca" exitCode=143 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.115098 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerDied","Data":"0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7"} Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.115423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerDied","Data":"76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca"} Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.200679 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.218316 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-798f469b5d-gs7bt" podStartSLOduration=8.773690594 podStartE2EDuration="11.21829477s" podCreationTimestamp="2026-03-13 20:48:10 +0000 UTC" firstStartedPulling="2026-03-13 20:48:16.563538493 +0000 UTC m=+1227.584654374" lastFinishedPulling="2026-03-13 20:48:19.008142659 +0000 UTC m=+1230.029258550" observedRunningTime="2026-03-13 20:48:20.164334074 +0000 UTC m=+1231.185449975" watchObservedRunningTime="2026-03-13 20:48:21.21829477 +0000 UTC m=+1232.239410661" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.493363 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.493925 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" containerID="cri-o://9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0" gracePeriod=30 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.494032 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" containerID="cri-o://abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66" gracePeriod=30 Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.529217 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77f687ff4f-d7b7z"] Mar 13 20:48:21 crc kubenswrapper[4790]: E0313 20:48:21.529623 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" containerName="init" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.529645 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" containerName="init" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 
20:48:21.529835 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="07188cb9-d2b0-4923-a90c-386eb3525476" containerName="init" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.530711 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.550519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77f687ff4f-d7b7z"] Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.551775 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": EOF" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-public-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610183 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-httpd-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610222 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610292 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-combined-ca-bundle\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610361 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-internal-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610650 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-ovndb-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.610705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdkt\" (UniqueName: \"kubernetes.io/projected/6c6f5d56-217d-441e-8771-503fd5e681fb-kube-api-access-gfdkt\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " 
pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712342 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdkt\" (UniqueName: \"kubernetes.io/projected/6c6f5d56-217d-441e-8771-503fd5e681fb-kube-api-access-gfdkt\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-public-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712557 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-httpd-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.712574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.713315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-combined-ca-bundle\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.713350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-internal-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.713445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-ovndb-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.718492 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-internal-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.718746 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-public-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.719453 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-httpd-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.719664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-combined-ca-bundle\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.720083 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-ovndb-tls-certs\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.725467 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6c6f5d56-217d-441e-8771-503fd5e681fb-config\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.742145 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdkt\" (UniqueName: \"kubernetes.io/projected/6c6f5d56-217d-441e-8771-503fd5e681fb-kube-api-access-gfdkt\") pod \"neutron-77f687ff4f-d7b7z\" (UID: \"6c6f5d56-217d-441e-8771-503fd5e681fb\") " pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:21 crc kubenswrapper[4790]: I0313 20:48:21.851909 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.142036 4790 generic.go:334] "Generic (PLEG): container finished" podID="ebc86632-179c-403a-bbdd-d496a21c018c" containerID="abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66" exitCode=0 Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.142477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerDied","Data":"abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66"} Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.153635 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerStarted","Data":"ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878"} Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.190902 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.46196809 podStartE2EDuration="6.190886137s" podCreationTimestamp="2026-03-13 20:48:16 +0000 UTC" firstStartedPulling="2026-03-13 20:48:17.269306222 +0000 UTC m=+1228.290422123" lastFinishedPulling="2026-03-13 20:48:18.998224279 +0000 UTC m=+1230.019340170" observedRunningTime="2026-03-13 20:48:22.188180393 +0000 UTC m=+1233.209296284" watchObservedRunningTime="2026-03-13 20:48:22.190886137 +0000 UTC m=+1233.212002028" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.409398 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.528125 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.529455 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531074 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs" (OuterVolumeSpecName: "logs") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531246 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531437 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531553 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531701 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.531817 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") pod \"de946747-1160-46da-bacd-7ac005e29c73\" (UID: \"de946747-1160-46da-bacd-7ac005e29c73\") " Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.533932 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.537398 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.537650 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m" (OuterVolumeSpecName: "kube-api-access-k8w5m") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "kube-api-access-k8w5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.537800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts" (OuterVolumeSpecName: "scripts") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.596477 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data" (OuterVolumeSpecName: "config-data") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.602659 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de946747-1160-46da-bacd-7ac005e29c73" (UID: "de946747-1160-46da-bacd-7ac005e29c73"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637615 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637656 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/de946747-1160-46da-bacd-7ac005e29c73-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637667 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637675 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637689 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/de946747-1160-46da-bacd-7ac005e29c73-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637700 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k8w5m\" (UniqueName: \"kubernetes.io/projected/de946747-1160-46da-bacd-7ac005e29c73-kube-api-access-k8w5m\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.637713 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/de946747-1160-46da-bacd-7ac005e29c73-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.695103 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77f687ff4f-d7b7z"] Mar 13 20:48:22 crc kubenswrapper[4790]: W0313 20:48:22.701932 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c6f5d56_217d_441e_8771_503fd5e681fb.slice/crio-80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872 WatchSource:0}: Error finding container 80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872: Status 404 returned error can't find the container with id 80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872 Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.838215 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:22 crc kubenswrapper[4790]: I0313 20:48:22.915794 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.036012 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.149877 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150233 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150359 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.150513 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") pod \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\" (UID: \"ccb1b2e8-4b05-411b-a540-6507fdd5775f\") " Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.152573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs" (OuterVolumeSpecName: "logs") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.155172 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.166732 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz" (OuterVolumeSpecName: "kube-api-access-bdqgz") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "kube-api-access-bdqgz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172739 4790 generic.go:334] "Generic (PLEG): container finished" podID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" exitCode=137 Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172774 4790 generic.go:334] "Generic (PLEG): container finished" podID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" exitCode=137 Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172808 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerDied","Data":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerDied","Data":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172845 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-76485b6c5-pjfp4" event={"ID":"ccb1b2e8-4b05-411b-a540-6507fdd5775f","Type":"ContainerDied","Data":"0043283db8557a3c195bf15d2769b7b9f5dc50b554145edd4e6c6abedb9ab898"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.172860 4790 scope.go:117] "RemoveContainer" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.173053 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-76485b6c5-pjfp4" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.182118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts" (OuterVolumeSpecName: "scripts") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.198940 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"de946747-1160-46da-bacd-7ac005e29c73","Type":"ContainerDied","Data":"8575cdd5bb4666a6bc6bc6c42910b6563764f091451eed83947cc6f64da3a0eb"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.199062 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.199256 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data" (OuterVolumeSpecName: "config-data") pod "ccb1b2e8-4b05-411b-a540-6507fdd5775f" (UID: "ccb1b2e8-4b05-411b-a540-6507fdd5775f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.213567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77f687ff4f-d7b7z" event={"ID":"6c6f5d56-217d-441e-8771-503fd5e681fb","Type":"ContainerStarted","Data":"430ecc8dc6d2c3b6e5e4d4a5d13a83f7c8f4e9ef0f8f462eadd32e1e37375e29"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.213610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77f687ff4f-d7b7z" event={"ID":"6c6f5d56-217d-441e-8771-503fd5e681fb","Type":"ContainerStarted","Data":"80172ef258e47f41d20fe7c918127a8ed517fb3a3a6cd11675295cf2b42c2872"} Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.243334 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252286 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ccb1b2e8-4b05-411b-a540-6507fdd5775f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252327 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdqgz\" (UniqueName: \"kubernetes.io/projected/ccb1b2e8-4b05-411b-a540-6507fdd5775f-kube-api-access-bdqgz\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252339 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252410 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ccb1b2e8-4b05-411b-a540-6507fdd5775f-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.252421 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ccb1b2e8-4b05-411b-a540-6507fdd5775f-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.255481 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264306 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264716 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264734 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264744 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264750 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264769 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264775 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.264787 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264792 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.264987 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.265017 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de946747-1160-46da-bacd-7ac005e29c73" containerName="cinder-api-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.265026 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon-log" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.265039 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" containerName="horizon" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.268101 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.312895 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.313315 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.314000 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.314120 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366519 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c42a2a27-f7c5-463b-982a-4dafcac978ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366619 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366638 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42a2a27-f7c5-463b-982a-4dafcac978ad-logs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-scripts\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 
13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366682 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366728 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366768 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.366803 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5fv2\" (UniqueName: \"kubernetes.io/projected/c42a2a27-f7c5-463b-982a-4dafcac978ad-kube-api-access-j5fv2\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.376575 4790 scope.go:117] "RemoveContainer" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469445 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5fv2\" (UniqueName: \"kubernetes.io/projected/c42a2a27-f7c5-463b-982a-4dafcac978ad-kube-api-access-j5fv2\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469544 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c42a2a27-f7c5-463b-982a-4dafcac978ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: 
I0313 20:48:23.469646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42a2a27-f7c5-463b-982a-4dafcac978ad-logs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469674 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-scripts\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469695 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469724 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.469748 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.473841 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.474103 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c42a2a27-f7c5-463b-982a-4dafcac978ad-logs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.474394 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c42a2a27-f7c5-463b-982a-4dafcac978ad-etc-machine-id\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.480414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.484397 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-scripts\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.484723 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.485898 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-config-data-custom\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.487975 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c42a2a27-f7c5-463b-982a-4dafcac978ad-public-tls-certs\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.496459 4790 scope.go:117] "RemoveContainer" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.498799 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5fv2\" (UniqueName: \"kubernetes.io/projected/c42a2a27-f7c5-463b-982a-4dafcac978ad-kube-api-access-j5fv2\") pod \"cinder-api-0\" (UID: \"c42a2a27-f7c5-463b-982a-4dafcac978ad\") " pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.504321 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": container with ID starting with 45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d not found: ID does not exist" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.504365 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} err="failed to get container status \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": rpc error: code = NotFound desc = could not find container \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": container with ID starting with 45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.504401 4790 scope.go:117] "RemoveContainer" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: E0313 20:48:23.505037 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": container with ID starting with f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488 not found: ID does not exist" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505063 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} err="failed to get container status \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": rpc error: code = NotFound desc = could not 
find container \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": container with ID starting with f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488 not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505077 4790 scope.go:117] "RemoveContainer" containerID="45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505547 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d"} err="failed to get container status \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": rpc error: code = NotFound desc = could not find container \"45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d\": container with ID starting with 45d8a934f0a1eb1004a034505a07a155513fa0721dbbfc1572652c50c902a80d not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.505577 4790 scope.go:117] "RemoveContainer" containerID="f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.506752 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488"} err="failed to get container status \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": rpc error: code = NotFound desc = could not find container \"f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488\": container with ID starting with f453f8603c16034ea99832775cc0502bd95310df64f28cd17480cc254d8d3488 not found: ID does not exist" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.506802 4790 scope.go:117] "RemoveContainer" containerID="0036043a9c1b9762413a450b9b563143cc9cb9428e1df3c4935884d5cfb2c1e7" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.544843 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.546533 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-655d56d4d9-rckws" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.158:9696/\": dial tcp 10.217.0.158:9696: connect: connection refused" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.557599 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-76485b6c5-pjfp4"] Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.562201 4790 scope.go:117] "RemoveContainer" containerID="76f821fb3e21a6dfe1f8aa6b85c0c0d915466d0811f4b0f0ac00a80caa7a5cca" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.612205 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.678989 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccb1b2e8-4b05-411b-a540-6507fdd5775f" path="/var/lib/kubelet/pods/ccb1b2e8-4b05-411b-a540-6507fdd5775f/volumes" Mar 13 20:48:23 crc kubenswrapper[4790]: I0313 20:48:23.680262 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de946747-1160-46da-bacd-7ac005e29c73" path="/var/lib/kubelet/pods/de946747-1160-46da-bacd-7ac005e29c73/volumes" Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.092631 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 13 20:48:24 crc kubenswrapper[4790]: W0313 20:48:24.095612 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc42a2a27_f7c5_463b_982a_4dafcac978ad.slice/crio-bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda WatchSource:0}: Error finding container bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda: Status 404 returned error can't find the container with id bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.242244 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77f687ff4f-d7b7z" event={"ID":"6c6f5d56-217d-441e-8771-503fd5e681fb","Type":"ContainerStarted","Data":"8e4d8d580c802f5fa347b1de5b751755e61399b2b76ba7aa9c6d3026313afeaa"} Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.242636 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.246856 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c42a2a27-f7c5-463b-982a-4dafcac978ad","Type":"ContainerStarted","Data":"bfa0be9b090e12370c0ace79ae50bfc0dfede9aa6ff57df04c11ed5c2a4e1dda"} Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.266617 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77f687ff4f-d7b7z" podStartSLOduration=3.266599091 podStartE2EDuration="3.266599091s" podCreationTimestamp="2026-03-13 20:48:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:24.261609724 +0000 UTC m=+1235.282725635" watchObservedRunningTime="2026-03-13 20:48:24.266599091 +0000 UTC m=+1235.287714982" Mar 13 20:48:24 crc kubenswrapper[4790]: I0313 20:48:24.748526 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.121096 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-686b857b8-6fghv" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.203204 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.233369 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.297346 4790 generic.go:334] "Generic (PLEG): container finished" podID="ebc86632-179c-403a-bbdd-d496a21c018c" containerID="9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0" exitCode=0 Mar 13 
20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.297553 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerDied","Data":"9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0"} Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.323190 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c42a2a27-f7c5-463b-982a-4dafcac978ad","Type":"ContainerStarted","Data":"a7466c9393a2e7ac2f9a9cc84c569eed3edbc93a91ed18f031c1147c7828cd61"} Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.325150 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" containerID="cri-o://75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.325175 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" containerID="cri-o://59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.772973 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7c6887dbdb-wnl4x" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.817268 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.856443 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.856713 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" containerID="cri-o://8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.857582 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" containerID="cri-o://17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360" gracePeriod=30 Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920651 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920781 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920818 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") pod 
\"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.920981 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.921092 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.921127 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") pod \"ebc86632-179c-403a-bbdd-d496a21c018c\" (UID: \"ebc86632-179c-403a-bbdd-d496a21c018c\") " Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.950471 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:25 crc kubenswrapper[4790]: I0313 20:48:25.973997 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b" (OuterVolumeSpecName: "kube-api-access-9968b") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "kube-api-access-9968b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.023410 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.023480 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9968b\" (UniqueName: \"kubernetes.io/projected/ebc86632-179c-403a-bbdd-d496a21c018c-kube-api-access-9968b\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.061081 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config" (OuterVolumeSpecName: "config") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.079859 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.082658 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.086492 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.110516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ebc86632-179c-403a-bbdd-d496a21c018c" (UID: "ebc86632-179c-403a-bbdd-d496a21c018c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126783 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126814 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126823 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126832 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.126840 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ebc86632-179c-403a-bbdd-d496a21c018c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.331909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"c42a2a27-f7c5-463b-982a-4dafcac978ad","Type":"ContainerStarted","Data":"c76b91edd8bb8101c227d42c9cd4c199b919c4c67e9a68dc22a5274872cc82c7"} Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.333208 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.333936 4790 generic.go:334] "Generic (PLEG): container finished" podID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerID="8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581" exitCode=143 Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.334019 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerDied","Data":"8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581"} Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.335773 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-655d56d4d9-rckws" event={"ID":"ebc86632-179c-403a-bbdd-d496a21c018c","Type":"ContainerDied","Data":"84f278a1006894e4224ea478ecf0e8138ab1ab2094c647ca6b5763b2c261a6bc"} Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.335815 4790 scope.go:117] "RemoveContainer" containerID="abc0a2f8a645b936e1377bd5e49e6a7c687f5aa7ff92e068165fc9da1349ac66" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.335828 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-655d56d4d9-rckws" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.364728 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.364703656 podStartE2EDuration="3.364703656s" podCreationTimestamp="2026-03-13 20:48:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:26.36232103 +0000 UTC m=+1237.383436921" watchObservedRunningTime="2026-03-13 20:48:26.364703656 +0000 UTC m=+1237.385819547" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.391436 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.400219 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-655d56d4d9-rckws"] Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.400580 4790 scope.go:117] "RemoveContainer" containerID="9d9f4f92c9adc75b1871526d72226856908062d5bbfea344d681a5684ec5cad0" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.662212 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.779258 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.873630 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:48:26 crc kubenswrapper[4790]: I0313 20:48:26.874422 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" containerID="cri-o://de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8" gracePeriod=10 Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.131880 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.349273 4790 generic.go:334] "Generic (PLEG): container finished" 
podID="6792eda6-a284-42ab-a650-f21b012f7f44" containerID="de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8" exitCode=0 Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.349370 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerDied","Data":"de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8"} Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.405784 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.554867 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653654 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653736 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653829 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653958 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.653986 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.654085 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") pod \"6792eda6-a284-42ab-a650-f21b012f7f44\" (UID: \"6792eda6-a284-42ab-a650-f21b012f7f44\") " Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.667425 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz" (OuterVolumeSpecName: "kube-api-access-6xvqz") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "kube-api-access-6xvqz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.677058 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" path="/var/lib/kubelet/pods/ebc86632-179c-403a-bbdd-d496a21c018c/volumes" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.703898 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.707037 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.710752 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.717099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config" (OuterVolumeSpecName: "config") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.728340 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6792eda6-a284-42ab-a650-f21b012f7f44" (UID: "6792eda6-a284-42ab-a650-f21b012f7f44"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756445 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756491 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xvqz\" (UniqueName: \"kubernetes.io/projected/6792eda6-a284-42ab-a650-f21b012f7f44-kube-api-access-6xvqz\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756510 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756523 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756534 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:27 crc kubenswrapper[4790]: I0313 20:48:27.756544 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6792eda6-a284-42ab-a650-f21b012f7f44-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.008331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.064724 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.363349 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" containerID="cri-o://b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9" gracePeriod=30 Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.363841 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.363865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55f844cf75-87qd2" event={"ID":"6792eda6-a284-42ab-a650-f21b012f7f44","Type":"ContainerDied","Data":"eb26cba5d4f1f28cf0c444cc204a575c1f7bb95d1f8f9337a19506bba53fe819"} Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.364083 4790 scope.go:117] "RemoveContainer" containerID="de1f1e831185e14abf69fe3f42e9442a69f0019e8f482343c4e0783b1bafacb8" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.364180 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" containerID="cri-o://ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878" gracePeriod=30 Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.395967 4790 scope.go:117] "RemoveContainer" containerID="c85d717e10fb599c6b3d50e3cfc797654ac9faa539262c4f8824bda9117967e3" Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.415850 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:48:28 crc kubenswrapper[4790]: I0313 20:48:28.425447 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-55f844cf75-87qd2"] Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.379459 4790 generic.go:334] "Generic (PLEG): container finished" podID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" exitCode=0 Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.379721 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerDied","Data":"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69"} Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.385176 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerID="ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878" exitCode=0 Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.385358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerDied","Data":"ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878"} Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.673268 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" path="/var/lib/kubelet/pods/6792eda6-a284-42ab-a650-f21b012f7f44/volumes" Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.710044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:29 crc kubenswrapper[4790]: I0313 20:48:29.847064 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058034 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-854ddc4bd-b4ws7"] Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058614 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 
20:48:30.058638 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058660 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="init" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058668 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="init" Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058678 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058686 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" Mar 13 20:48:30 crc kubenswrapper[4790]: E0313 20:48:30.058706 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058715 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058979 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6792eda6-a284-42ab-a650-f21b012f7f44" containerName="dnsmasq-dns" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.058998 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-api" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.059008 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebc86632-179c-403a-bbdd-d496a21c018c" containerName="neutron-httpd" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.060163 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.069781 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-854ddc4bd-b4ws7"] Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b14c1738-5e9e-4810-b926-5b05af9ec22d-logs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223836 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-public-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223859 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-scripts\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.223875 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-combined-ca-bundle\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.224072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zdh7\" (UniqueName: \"kubernetes.io/projected/b14c1738-5e9e-4810-b926-5b05af9ec22d-kube-api-access-6zdh7\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.224188 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-config-data\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.224424 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-internal-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-internal-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325687 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/b14c1738-5e9e-4810-b926-5b05af9ec22d-logs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325746 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-public-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-scripts\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325792 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-combined-ca-bundle\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zdh7\" (UniqueName: \"kubernetes.io/projected/b14c1738-5e9e-4810-b926-5b05af9ec22d-kube-api-access-6zdh7\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.325895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-config-data\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.326295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b14c1738-5e9e-4810-b926-5b05af9ec22d-logs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.332050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-config-data\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.332052 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-internal-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.335631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-scripts\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " 
pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.339804 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-public-tls-certs\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.345691 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b14c1738-5e9e-4810-b926-5b05af9ec22d-combined-ca-bundle\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.349109 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zdh7\" (UniqueName: \"kubernetes.io/projected/b14c1738-5e9e-4810-b926-5b05af9ec22d-kube-api-access-6zdh7\") pod \"placement-854ddc4bd-b4ws7\" (UID: \"b14c1738-5e9e-4810-b926-5b05af9ec22d\") " pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.382425 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.429772 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:30 crc kubenswrapper[4790]: W0313 20:48:30.863256 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb14c1738_5e9e_4810_b926_5b05af9ec22d.slice/crio-c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8 WatchSource:0}: Error finding container c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8: Status 404 returned error can't find the container with id c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8 Mar 13 20:48:30 crc kubenswrapper[4790]: I0313 20:48:30.863584 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-854ddc4bd-b4ws7"] Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.276664 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": read tcp 10.217.0.2:46044->10.217.0.165:9311: read: connection reset by peer" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.277166 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.277218 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": 
read tcp 10.217.0.2:46048->10.217.0.165:9311: read: connection reset by peer" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.277523 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-5c748666b-tvhxb" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.165:9311/healthcheck\": dial tcp 10.217.0.165:9311: connect: connection refused" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.422865 4790 generic.go:334] "Generic (PLEG): container finished" podID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerID="b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9" exitCode=0 Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.423117 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerDied","Data":"b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.430540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854ddc4bd-b4ws7" event={"ID":"b14c1738-5e9e-4810-b926-5b05af9ec22d","Type":"ContainerStarted","Data":"0bb4c167ba356dbccb82728022adb0fe6fcad84e2e8b4fa3e49ed62833eda460"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.430602 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854ddc4bd-b4ws7" event={"ID":"b14c1738-5e9e-4810-b926-5b05af9ec22d","Type":"ContainerStarted","Data":"f97b4ab3b27494d831c83dee4c542c033948187c65aed470e1f59fca6513f8ec"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.430616 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-854ddc4bd-b4ws7" event={"ID":"b14c1738-5e9e-4810-b926-5b05af9ec22d","Type":"ContainerStarted","Data":"c12e16de22ade5386b8ff9b548fc3eb5d82ed370ff00bc188e6bdb058c6316c8"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.431081 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.431509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.433987 4790 generic.go:334] "Generic (PLEG): container finished" podID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerID="17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360" exitCode=0 Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.434042 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerDied","Data":"17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360"} Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.456901 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-854ddc4bd-b4ws7" podStartSLOduration=1.456876493 podStartE2EDuration="1.456876493s" podCreationTimestamp="2026-03-13 20:48:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:31.451505646 +0000 UTC m=+1242.472621547" watchObservedRunningTime="2026-03-13 20:48:31.456876493 +0000 UTC m=+1242.477992394" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.708962 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.825742 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.854933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855277 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855314 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855363 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855414 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.855465 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") pod \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\" (UID: \"7dd8df37-b60e-4ef1-9b53-6a59ba59e538\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.856099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.862206 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.863250 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt" (OuterVolumeSpecName: "kube-api-access-srclt") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "kube-api-access-srclt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.875407 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts" (OuterVolumeSpecName: "scripts") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.928150 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957054 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957114 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957315 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957433 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") pod \"84cf2aee-27d9-4022-8c67-55840b2faedd\" (UID: \"84cf2aee-27d9-4022-8c67-55840b2faedd\") " Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957770 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs" (OuterVolumeSpecName: "logs") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957799 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957849 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srclt\" (UniqueName: \"kubernetes.io/projected/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-kube-api-access-srclt\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957861 4790 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957871 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.957880 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.961575 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl" (OuterVolumeSpecName: "kube-api-access-d2mvl") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "kube-api-access-d2mvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.962360 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.975517 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data" (OuterVolumeSpecName: "config-data") pod "7dd8df37-b60e-4ef1-9b53-6a59ba59e538" (UID: "7dd8df37-b60e-4ef1-9b53-6a59ba59e538"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:31 crc kubenswrapper[4790]: I0313 20:48:31.985036 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.005481 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data" (OuterVolumeSpecName: "config-data") pod "84cf2aee-27d9-4022-8c67-55840b2faedd" (UID: "84cf2aee-27d9-4022-8c67-55840b2faedd"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061828 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2mvl\" (UniqueName: \"kubernetes.io/projected/84cf2aee-27d9-4022-8c67-55840b2faedd-kube-api-access-d2mvl\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061871 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7dd8df37-b60e-4ef1-9b53-6a59ba59e538-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061885 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/84cf2aee-27d9-4022-8c67-55840b2faedd-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061896 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061911 4790 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.061921 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/84cf2aee-27d9-4022-8c67-55840b2faedd-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.443910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5c748666b-tvhxb" event={"ID":"84cf2aee-27d9-4022-8c67-55840b2faedd","Type":"ContainerDied","Data":"d42a77ccb8df8e4551bc2d02ca8dcf98b96ca45a3c404d603bdd2962aa71b56a"} Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.443965 4790 scope.go:117] "RemoveContainer" containerID="17f7572a60defea9a0c54762cab48549434213f3829ef554fc1c0c0339839360" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.444071 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5c748666b-tvhxb" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.451296 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.451480 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"7dd8df37-b60e-4ef1-9b53-6a59ba59e538","Type":"ContainerDied","Data":"356f1e6d14418c7b4d47c93a6ddd977f1792b488cf4e7ec1247f3eaca698c030"} Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.471483 4790 scope.go:117] "RemoveContainer" containerID="8d83c8808f4540d59bea2732861e3d03b6d099d9067691314cc326bd7240a581" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.485851 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.496516 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5c748666b-tvhxb"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.503276 4790 scope.go:117] "RemoveContainer" containerID="ffca9dd21fbe9bb0e162063d797b781df50325594db940d15d5f923d328ac878" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.504891 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.514049 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.521625 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.521980 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.521997 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.522009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522015 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.522040 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522046 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" Mar 13 20:48:32 crc kubenswrapper[4790]: E0313 20:48:32.522069 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522076 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522235 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522248 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="probe" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522261 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" containerName="cinder-scheduler" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.522276 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" containerName="barbican-api-log" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.523144 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.526134 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.533720 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.543528 4790 scope.go:117] "RemoveContainer" containerID="b27907adc19d02cf9eb527f95f4e0f1927d997cba55d2e2d8cff7b9730da30e9" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674462 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674896 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.674991 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9jj\" (UniqueName: \"kubernetes.io/projected/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-kube-api-access-fw9jj\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.675025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.675047 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776530 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw9jj\" (UniqueName: 
\"kubernetes.io/projected/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-kube-api-access-fw9jj\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776610 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776752 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.776866 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.780686 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.780810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-scripts\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.781213 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.793256 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fw9jj\" (UniqueName: \"kubernetes.io/projected/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-kube-api-access-fw9jj\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.803973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5\") " pod="openstack/cinder-scheduler-0" Mar 13 20:48:32 crc kubenswrapper[4790]: I0313 20:48:32.857582 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.317156 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 13 20:48:33 crc kubenswrapper[4790]: W0313 20:48:33.319967 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ccdc6f2_f911_48c1_b8a8_dc6f2054fed5.slice/crio-23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046 WatchSource:0}: Error finding container 23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046: Status 404 returned error can't find the container with id 23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046 Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.479876 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5","Type":"ContainerStarted","Data":"23fb9682fba1ed858ea3d7cc017852286d7cadc8b134423c5b6577108c2a7046"} Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.671429 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7dd8df37-b60e-4ef1-9b53-6a59ba59e538" path="/var/lib/kubelet/pods/7dd8df37-b60e-4ef1-9b53-6a59ba59e538/volumes" Mar 13 20:48:33 crc kubenswrapper[4790]: I0313 20:48:33.672618 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84cf2aee-27d9-4022-8c67-55840b2faedd" path="/var/lib/kubelet/pods/84cf2aee-27d9-4022-8c67-55840b2faedd/volumes" Mar 13 20:48:34 crc kubenswrapper[4790]: I0313 20:48:34.489180 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5","Type":"ContainerStarted","Data":"5c2d6b68291f7ff6741dc13b76e0434d10b189856a64a8b04fbdb7c15279b680"} Mar 13 20:48:34 crc kubenswrapper[4790]: I0313 20:48:34.489706 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5","Type":"ContainerStarted","Data":"5319bb6a60d6fcc768d905176c5ccd9cc1fe41c0f21dd6d023c85919e06fcc93"} Mar 13 20:48:34 crc kubenswrapper[4790]: I0313 20:48:34.511347 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=2.511331181 podStartE2EDuration="2.511331181s" podCreationTimestamp="2026-03-13 20:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:34.508233107 +0000 UTC m=+1245.529348998" watchObservedRunningTime="2026-03-13 20:48:34.511331181 +0000 UTC m=+1245.532447072" Mar 13 20:48:35 crc kubenswrapper[4790]: I0313 20:48:35.454925 
4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 13 20:48:36 crc kubenswrapper[4790]: I0313 20:48:36.447765 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-c5788df58-llnz4" Mar 13 20:48:37 crc kubenswrapper[4790]: I0313 20:48:37.858361 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.428785 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.767167 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.768516 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.770713 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.772717 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.782231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-d69w2" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.784497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.858485 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.858739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.858785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfjl6\" (UniqueName: \"kubernetes.io/projected/7f0237c2-5c72-4776-9226-67244abca8dd-kube-api-access-pfjl6\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.859095 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961492 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961602 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961716 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.961737 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfjl6\" (UniqueName: \"kubernetes.io/projected/7f0237c2-5c72-4776-9226-67244abca8dd-kube-api-access-pfjl6\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.962352 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.970263 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-openstack-config-secret\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.970453 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f0237c2-5c72-4776-9226-67244abca8dd-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:40 crc kubenswrapper[4790]: I0313 20:48:40.980789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfjl6\" (UniqueName: \"kubernetes.io/projected/7f0237c2-5c72-4776-9226-67244abca8dd-kube-api-access-pfjl6\") pod \"openstackclient\" (UID: \"7f0237c2-5c72-4776-9226-67244abca8dd\") " pod="openstack/openstackclient" Mar 13 20:48:41 crc kubenswrapper[4790]: I0313 20:48:41.145003 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 13 20:48:41 crc kubenswrapper[4790]: I0313 20:48:41.640795 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 13 20:48:42 crc kubenswrapper[4790]: I0313 20:48:42.566165 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7f0237c2-5c72-4776-9226-67244abca8dd","Type":"ContainerStarted","Data":"abc967605bc85a141593aed19b10621514cf7f5701c183acf4e6e90950ad49e9"} Mar 13 20:48:43 crc kubenswrapper[4790]: I0313 20:48:43.180955 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.929964 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-798495789f-5fvw5"] Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.939417 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.942126 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.942247 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.942416 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 13 20:48:44 crc kubenswrapper[4790]: I0313 20:48:44.948603 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-798495789f-5fvw5"] Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039202 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-run-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039336 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-config-data\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-combined-ca-bundle\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039426 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5gjn\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-kube-api-access-z5gjn\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-log-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039580 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-public-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039620 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-etc-swift\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.039662 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-internal-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141038 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-config-data\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-combined-ca-bundle\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141139 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5gjn\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-kube-api-access-z5gjn\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141170 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-log-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141208 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-public-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-etc-swift\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141276 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-internal-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-run-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.141623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-log-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.142015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d498924-f84f-48aa-b971-b58cbea48295-run-httpd\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.149565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-public-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.149565 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-internal-tls-certs\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.152232 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-combined-ca-bundle\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.153031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-etc-swift\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.158272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d498924-f84f-48aa-b971-b58cbea48295-config-data\") pod \"swift-proxy-798495789f-5fvw5\" (UID: 
\"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.164951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5gjn\" (UniqueName: \"kubernetes.io/projected/7d498924-f84f-48aa-b971-b58cbea48295-kube-api-access-z5gjn\") pod \"swift-proxy-798495789f-5fvw5\" (UID: \"7d498924-f84f-48aa-b971-b58cbea48295\") " pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.279010 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:45 crc kubenswrapper[4790]: I0313 20:48:45.949787 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-798495789f-5fvw5"] Mar 13 20:48:45 crc kubenswrapper[4790]: W0313 20:48:45.997323 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d498924_f84f_48aa_b971_b58cbea48295.slice/crio-e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a WatchSource:0}: Error finding container e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a: Status 404 returned error can't find the container with id e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.535541 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592296 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592373 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592422 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592488 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592543 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592660 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.592705 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") pod \"1abdfade-817b-4659-b8be-48bb516fb866\" (UID: \"1abdfade-817b-4659-b8be-48bb516fb866\") " Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.594478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.595323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.603186 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79" (OuterVolumeSpecName: "kube-api-access-bdq79") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "kube-api-access-bdq79". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.603494 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts" (OuterVolumeSpecName: "scripts") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.625838 4790 generic.go:334] "Generic (PLEG): container finished" podID="1abdfade-817b-4659-b8be-48bb516fb866" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" exitCode=137 Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.626141 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.630980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.631055 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1abdfade-817b-4659-b8be-48bb516fb866","Type":"ContainerDied","Data":"3240e14626b1a27ca0670703b5e37bc443567929afbc6effd03a1f681e6eeda6"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.631079 4790 scope.go:117] "RemoveContainer" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.641135 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-798495789f-5fvw5" event={"ID":"7d498924-f84f-48aa-b971-b58cbea48295","Type":"ContainerStarted","Data":"b2473dc84a21b361e755e4dc3b18c75ade1a917e3696e1b6cf9c8ebc083cbfcf"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.641462 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-798495789f-5fvw5" event={"ID":"7d498924-f84f-48aa-b971-b58cbea48295","Type":"ContainerStarted","Data":"e864879895a5f488df2e8a0cdbee31bf6e25161fd4a9be68385c40769776f87a"} Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.679479 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.680366 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.694886 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695090 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695104 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1abdfade-817b-4659-b8be-48bb516fb866-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695117 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695127 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdq79\" (UniqueName: \"kubernetes.io/projected/1abdfade-817b-4659-b8be-48bb516fb866-kube-api-access-bdq79\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.695138 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.696519 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data" (OuterVolumeSpecName: "config-data") pod "1abdfade-817b-4659-b8be-48bb516fb866" (UID: "1abdfade-817b-4659-b8be-48bb516fb866"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.796594 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1abdfade-817b-4659-b8be-48bb516fb866-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.988193 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:46 crc kubenswrapper[4790]: I0313 20:48:46.999745 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.029410 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:47 crc kubenswrapper[4790]: E0313 20:48:47.030347 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.030393 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" Mar 13 20:48:47 crc kubenswrapper[4790]: E0313 20:48:47.030438 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.030448 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" Mar 13 20:48:47 crc kubenswrapper[4790]: E0313 20:48:47.030522 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.030534 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.033807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="proxy-httpd" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.033873 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="ceilometer-notification-agent" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.033903 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1abdfade-817b-4659-b8be-48bb516fb866" containerName="sg-core" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.035948 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.040791 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.042010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.046846 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104173 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104330 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104357 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104443 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104542 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104591 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.104645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.206836 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.206929 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.206961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207023 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207172 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207199 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.207524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.212033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.212069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.215631 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.224743 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.225091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"ceilometer-0\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.361790 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:48:47 crc kubenswrapper[4790]: I0313 20:48:47.689527 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1abdfade-817b-4659-b8be-48bb516fb866" path="/var/lib/kubelet/pods/1abdfade-817b-4659-b8be-48bb516fb866/volumes" Mar 13 20:48:49 crc kubenswrapper[4790]: I0313 20:48:49.952149 4790 scope.go:117] "RemoveContainer" containerID="31ce3becbe5f9fc73efb71d7c9c70a67bb2549c4e27e76481e3678501a4317cf" Mar 13 20:48:50 crc kubenswrapper[4790]: I0313 20:48:50.429069 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-77655f674d-4r7h4" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.151:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.151:8443: connect: connection refused" Mar 13 20:48:50 crc kubenswrapper[4790]: I0313 20:48:50.429447 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.872133 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77f687ff4f-d7b7z" Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.971850 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.972392 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fc7fb5bf6-ctr9l" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" containerID="cri-o://449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09" gracePeriod=30 Mar 13 20:48:51 crc kubenswrapper[4790]: I0313 20:48:51.972804 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-5fc7fb5bf6-ctr9l" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" containerID="cri-o://8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e" gracePeriod=30 Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.016693 4790 scope.go:117] "RemoveContainer" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.127547 4790 scope.go:117] "RemoveContainer" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 
20:48:52.236845 4790 scope.go:117] "RemoveContainer" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" Mar 13 20:48:52 crc kubenswrapper[4790]: E0313 20:48:52.238471 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4\": container with ID starting with 8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4 not found: ID does not exist" containerID="8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.238535 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4"} err="failed to get container status \"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4\": rpc error: code = NotFound desc = could not find container \"8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4\": container with ID starting with 8f9a092ab57a4d1dfa6ebfee1a22457c36426a31c07ee0a8e3924539cc642eb4 not found: ID does not exist" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.238556 4790 scope.go:117] "RemoveContainer" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" Mar 13 20:48:52 crc kubenswrapper[4790]: E0313 20:48:52.239880 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486\": container with ID starting with fe7297aab5981431006e363000146624b164562815f098000374d6b910719486 not found: ID does not exist" containerID="fe7297aab5981431006e363000146624b164562815f098000374d6b910719486" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.239910 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486"} err="failed to get container status \"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486\": rpc error: code = NotFound desc = could not find container \"fe7297aab5981431006e363000146624b164562815f098000374d6b910719486\": container with ID starting with fe7297aab5981431006e363000146624b164562815f098000374d6b910719486 not found: ID does not exist" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.239930 4790 scope.go:117] "RemoveContainer" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" Mar 13 20:48:52 crc kubenswrapper[4790]: E0313 20:48:52.240242 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c\": container with ID starting with 0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c not found: ID does not exist" containerID="0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.240289 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c"} err="failed to get container status \"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c\": rpc error: code = NotFound desc = could not find container \"0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c\": container with ID 
starting with 0ae139c8f65924e4576ab0ac2f14e878f27dea8835f5511c8c80ac90638c4c0c not found: ID does not exist" Mar 13 20:48:52 crc kubenswrapper[4790]: W0313 20:48:52.637984 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddc306890_4355_4f40_abc0_11753b34d120.slice/crio-024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0 WatchSource:0}: Error finding container 024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0: Status 404 returned error can't find the container with id 024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0 Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.638616 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.772772 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.775184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7f0237c2-5c72-4776-9226-67244abca8dd","Type":"ContainerStarted","Data":"ec8012539c5fb52c716a3829d23983c3bc81706c32675695f09bf5f4ba8e2727"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.778342 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-798495789f-5fvw5" event={"ID":"7d498924-f84f-48aa-b971-b58cbea48295","Type":"ContainerStarted","Data":"75d271d47d12592ed78b17cd53ef3de64fc116a56e2a8f41c8439584d19b2524"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.778530 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.781079 4790 generic.go:334] "Generic (PLEG): container finished" podID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerID="8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e" exitCode=0 Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.781145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerDied","Data":"8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e"} Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.798360 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.229869181 podStartE2EDuration="12.798338369s" podCreationTimestamp="2026-03-13 20:48:40 +0000 UTC" firstStartedPulling="2026-03-13 20:48:41.648028821 +0000 UTC m=+1252.669144713" lastFinishedPulling="2026-03-13 20:48:52.216498 +0000 UTC m=+1263.237613901" observedRunningTime="2026-03-13 20:48:52.789805176 +0000 UTC m=+1263.810921077" watchObservedRunningTime="2026-03-13 20:48:52.798338369 +0000 UTC m=+1263.819454260" Mar 13 20:48:52 crc kubenswrapper[4790]: I0313 20:48:52.816734 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-798495789f-5fvw5" podStartSLOduration=8.816719591 podStartE2EDuration="8.816719591s" podCreationTimestamp="2026-03-13 20:48:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:48:52.812215718 +0000 UTC m=+1263.833331619" 
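
The pod_startup_latency_tracker entries above report three figures for openstack/openstackclient: podStartE2EDuration=12.798338369s, podStartSLOduration=2.229869181s, and an image-pull window bounded by firstStartedPulling and lastFinishedPulling. Working from the monotonic offsets (m=+...) printed in that entry, the SLO duration appears to be the end-to-end duration minus the pull window; the check below is arithmetic on the logged values only, and that interpretation is inferred rather than quoted from kubelet documentation.

package main

import "fmt"

// Cross-check of the openstackclient startup-duration entry using its m=+ offsets (seconds).
func main() {
	const (
		firstStartedPulling = 1252.669144713
		lastFinishedPulling = 1263.237613901
		podStartE2E         = 12.798338369 // reported podStartE2EDuration
	)
	pulling := lastFinishedPulling - firstStartedPulling
	fmt.Printf("image pull window:  %.9fs\n", pulling)             // ≈ 10.568469188s
	fmt.Printf("E2E minus pulling:  %.9fs\n", podStartE2E-pulling) // ≈ 2.229869181s, the reported podStartSLOduration
}

For cinder-scheduler-0 and swift-proxy-798495789f-5fvw5 the pull timestamps are the zero value and the two reported durations coincide, which is consistent with the same relationship.
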
watchObservedRunningTime="2026-03-13 20:48:52.816719591 +0000 UTC m=+1263.837835482" Mar 13 20:48:53 crc kubenswrapper[4790]: I0313 20:48:53.791350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22"} Mar 13 20:48:53 crc kubenswrapper[4790]: I0313 20:48:53.791612 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.142603 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.248176 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.248489 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" containerID="cri-o://4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d" gracePeriod=30 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.248553 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" containerID="cri-o://96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e" gracePeriod=30 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.819749 4790 generic.go:334] "Generic (PLEG): container finished" podID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerID="449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09" exitCode=0 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.819933 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerDied","Data":"449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09"} Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.825695 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerID="4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d" exitCode=143 Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.825770 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerDied","Data":"4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d"} Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.831207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9"} Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.862834 4790 util.go:48] "No ready sandbox for pod can be found. 
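
The teardown entries around here show three different container exit codes: 0 for neutron-httpd and neutron-api, 143 for glance-log, and 137 earlier for the old ceilometer-0 container. These follow the usual 128+signal convention (143 is SIGTERM honoured during the 30s grace period, 137 is SIGKILL after it); the snippet below just decodes those values and is not taken from the kubelet.

package main

import "fmt"

// Decode the exitCode values reported by the "Generic (PLEG): container finished" entries.
func describe(code int) string {
	switch {
	case code == 0:
		return "exited cleanly"
	case code > 128:
		return fmt.Sprintf("killed by signal %d", code-128)
	default:
		return fmt.Sprintf("exited with status %d", code)
	}
}

func main() {
	fmt.Println(137, describe(137)) // old ceilometer-0 container: SIGKILL after the grace period
	fmt.Println(143, describe(143)) // glance-log: SIGTERM honoured, graceful stop
	fmt.Println(0, describe(0))     // neutron-httpd / neutron-api: clean shutdown
}
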
Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878242 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878415 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878490 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878564 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.878606 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.889203 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26" (OuterVolumeSpecName: "kube-api-access-xcl26") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "kube-api-access-xcl26". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.891264 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.938862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config" (OuterVolumeSpecName: "config") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.955633 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.991408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.992644 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") pod \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\" (UID: \"96f53d5c-8b27-4810-a760-f7c9a4ee567b\") " Mar 13 20:48:54 crc kubenswrapper[4790]: W0313 20:48:54.992812 4790 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/96f53d5c-8b27-4810-a760-f7c9a4ee567b/volumes/kubernetes.io~secret/ovndb-tls-certs Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.992844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "96f53d5c-8b27-4810-a760-f7c9a4ee567b" (UID: "96f53d5c-8b27-4810-a760-f7c9a4ee567b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993461 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993493 4790 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993505 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993520 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/96f53d5c-8b27-4810-a760-f7c9a4ee567b-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:54 crc kubenswrapper[4790]: I0313 20:48:54.993531 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcl26\" (UniqueName: \"kubernetes.io/projected/96f53d5c-8b27-4810-a760-f7c9a4ee567b-kube-api-access-xcl26\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.610600 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.776642 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.804901 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.804976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805011 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805216 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805264 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.805317 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") pod \"596ad32f-9087-4dbe-a495-8bf03200cd60\" (UID: \"596ad32f-9087-4dbe-a495-8bf03200cd60\") " Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.807317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs" (OuterVolumeSpecName: "logs") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.811522 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k" (OuterVolumeSpecName: "kube-api-access-l6t9k") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "kube-api-access-l6t9k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.814855 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.835095 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data" (OuterVolumeSpecName: "config-data") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.846300 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts" (OuterVolumeSpecName: "scripts") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.846664 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5fc7fb5bf6-ctr9l" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.846842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5fc7fb5bf6-ctr9l" event={"ID":"96f53d5c-8b27-4810-a760-f7c9a4ee567b","Type":"ContainerDied","Data":"244df6a639eefd00639382cbb0a24020174298bb705bcc4beb3a1a60874bb9a0"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.847102 4790 scope.go:117] "RemoveContainer" containerID="8b2e29cd1d39fc375a2c87170b615afd8165699c2feb129d8fe6f2064e48bc4e" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851587 4790 generic.go:334] "Generic (PLEG): container finished" podID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" exitCode=137 Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851672 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerDied","Data":"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-77655f674d-4r7h4" event={"ID":"596ad32f-9087-4dbe-a495-8bf03200cd60","Type":"ContainerDied","Data":"32071f4748bdbdbbb2169f1b2a9fc194d9a40accb2c6784c59874d08e8b9f3b6"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.851763 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-77655f674d-4r7h4" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.857512 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.867145 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38"} Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.886891 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.894669 4790 scope.go:117] "RemoveContainer" containerID="449a35d79f426767909c30ff57f1a03c65663f3b50a2fecaf21aa36b537c5d09" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.898495 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "596ad32f-9087-4dbe-a495-8bf03200cd60" (UID: "596ad32f-9087-4dbe-a495-8bf03200cd60"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.904877 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5fc7fb5bf6-ctr9l"] Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907594 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907627 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907637 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/596ad32f-9087-4dbe-a495-8bf03200cd60-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907646 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/596ad32f-9087-4dbe-a495-8bf03200cd60-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907656 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6t9k\" (UniqueName: \"kubernetes.io/projected/596ad32f-9087-4dbe-a495-8bf03200cd60-kube-api-access-l6t9k\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907666 4790 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.907675 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/596ad32f-9087-4dbe-a495-8bf03200cd60-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:55 crc kubenswrapper[4790]: I0313 20:48:55.912184 4790 scope.go:117] "RemoveContainer" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.075114 4790 scope.go:117] "RemoveContainer" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" Mar 13 20:48:56 crc 
kubenswrapper[4790]: I0313 20:48:56.097087 4790 scope.go:117] "RemoveContainer" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" Mar 13 20:48:56 crc kubenswrapper[4790]: E0313 20:48:56.097686 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69\": container with ID starting with 59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69 not found: ID does not exist" containerID="59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.097720 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69"} err="failed to get container status \"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69\": rpc error: code = NotFound desc = could not find container \"59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69\": container with ID starting with 59f1e88ce1a2ada62792c6e908712145243405721eadf9a69ef6c9d220648d69 not found: ID does not exist" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.097741 4790 scope.go:117] "RemoveContainer" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" Mar 13 20:48:56 crc kubenswrapper[4790]: E0313 20:48:56.098164 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483\": container with ID starting with 75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483 not found: ID does not exist" containerID="75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.098238 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483"} err="failed to get container status \"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483\": rpc error: code = NotFound desc = could not find container \"75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483\": container with ID starting with 75b421cd9eb05fbfd7c841210ae03842b5f07370e5daa2526e6d456868677483 not found: ID does not exist" Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.185393 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:48:56 crc kubenswrapper[4790]: I0313 20:48:56.199327 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-77655f674d-4r7h4"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.582920 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583639 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583655 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583669 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" Mar 13 20:48:57 crc 
kubenswrapper[4790]: I0313 20:48:57.583674 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583685 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583691 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" Mar 13 20:48:57 crc kubenswrapper[4790]: E0313 20:48:57.583707 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583713 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583872 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583886 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-api" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583896 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" containerName="neutron-httpd" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.583911 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" containerName="horizon-log" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.584507 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.595082 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.654177 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.654341 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.673825 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="596ad32f-9087-4dbe-a495-8bf03200cd60" path="/var/lib/kubelet/pods/596ad32f-9087-4dbe-a495-8bf03200cd60/volumes" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.674419 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96f53d5c-8b27-4810-a760-f7c9a4ee567b" path="/var/lib/kubelet/pods/96f53d5c-8b27-4810-a760-f7c9a4ee567b/volumes" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.689155 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.690545 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.692746 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.704571 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.706337 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.712953 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.721788 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757247 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757589 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.757901 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.767997 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.788013 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.792722 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.795833 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"nova-api-db-create-jjv8c\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.801406 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863737 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863797 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863859 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.863927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.864670 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.864869 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.884002 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"nova-api-fe05-account-create-update-dwwd8\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.903169 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.905468 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.905570 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"nova-cell0-db-create-kq55v\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.907134 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.911231 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerStarted","Data":"dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99"} Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915790 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" containerID="cri-o://e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915882 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" containerID="cri-o://427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915884 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" containerID="cri-o://c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915940 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" 
containerID="cri-o://dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99" gracePeriod=30 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.915899 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.929908 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerID="96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e" exitCode=0 Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.929992 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerDied","Data":"96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e"} Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.937139 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.963726 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=7.915989563 podStartE2EDuration="11.963580989s" podCreationTimestamp="2026-03-13 20:48:46 +0000 UTC" firstStartedPulling="2026-03-13 20:48:52.640350321 +0000 UTC m=+1263.661466212" lastFinishedPulling="2026-03-13 20:48:56.687941747 +0000 UTC m=+1267.709057638" observedRunningTime="2026-03-13 20:48:57.952074305 +0000 UTC m=+1268.973190206" watchObservedRunningTime="2026-03-13 20:48:57.963580989 +0000 UTC m=+1268.984696880" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969761 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.969827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:57 crc kubenswrapper[4790]: I0313 20:48:57.970593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.002465 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"nova-cell1-db-create-lrnph\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.008104 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.034300 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.076902 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.076986 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.085484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.100699 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.102189 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.106663 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.110822 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.120057 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"nova-cell0-926f-account-create-update-nnl2f\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.152240 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.178931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.179087 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.280214 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.280608 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.281251 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.327549 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"nova-cell1-8821-account-create-update-l6ffx\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.343943 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.344220 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" containerID="cri-o://6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6" gracePeriod=30 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.344817 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" containerID="cri-o://5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343" gracePeriod=30 Mar 13 
20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.368066 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.424343 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.629281 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:48:58 crc kubenswrapper[4790]: W0313 20:48:58.631731 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c861107_6a1d_49f7_bc63_b95008ee5ddc.slice/crio-42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe WatchSource:0}: Error finding container 42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe: Status 404 returned error can't find the container with id 42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.721077 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803751 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803811 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803881 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.803914 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804074 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804165 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804208 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.804448 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") pod \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\" (UID: \"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2\") " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.810115 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs" (OuterVolumeSpecName: "logs") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.810434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.825421 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.825547 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts" (OuterVolumeSpecName: "scripts") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.836670 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz" (OuterVolumeSpecName: "kube-api-access-j2cwz") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "kube-api-access-j2cwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.889497 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907249 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907278 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2cwz\" (UniqueName: \"kubernetes.io/projected/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-kube-api-access-j2cwz\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907287 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907307 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.907318 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.916956 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.948347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kq55v" event={"ID":"86c0a379-8f0b-4414-863c-eaed0745ce2d","Type":"ContainerStarted","Data":"e654e8a2a2f4ddba0cbe74c5ccd432a4c0611cace807bcb47d1cab56837685c1"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986127 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99" exitCode=0 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986173 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38" exitCode=2 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986204 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9" exitCode=0 Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986264 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986296 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.986311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9"} Mar 13 20:48:58 crc kubenswrapper[4790]: I0313 20:48:58.992093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe05-account-create-update-dwwd8" event={"ID":"1a4ef124-b4dd-43df-bdfb-97c65685977c","Type":"ContainerStarted","Data":"a482d341a0588b1286360a3c7bf6118a7d5c154aafca78a5c2aa6c70a4917ca8"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.001553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.003761 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.009477 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.010335 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.010404 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.010576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjv8c" event={"ID":"9c861107-6a1d-49f7-bc63-b95008ee5ddc","Type":"ContainerStarted","Data":"42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.013081 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2","Type":"ContainerDied","Data":"ba875bd508f6a929ed72f4f60e05be777631ac626ac3eec05ada1ba30d28bfc5"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.013101 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.013131 4790 scope.go:117] "RemoveContainer" containerID="96ac0a7c5978eeb8c0f3a4fc52a8593d87b076ec513b819bbd3b74106a8ca70e" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.024605 4790 generic.go:334] "Generic (PLEG): container finished" podID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerID="6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6" exitCode=143 Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.024666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerDied","Data":"6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6"} Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.056546 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data" (OuterVolumeSpecName: "config-data") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.078704 4790 scope.go:117] "RemoveContainer" containerID="4349a4319d7d7f3a7af4e8d8122ef2003198a82dbec9b58b843ef6769bc7f33d" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.097327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" (UID: "6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.119268 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.119306 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.135721 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.146091 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:48:59 crc kubenswrapper[4790]: W0313 20:48:59.157344 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod00f4f78b_ccfb_4413_9a81_d5b461a5e319.slice/crio-69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809 WatchSource:0}: Error finding container 69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809: Status 404 returned error can't find the container with id 69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809 Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.414056 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.452550 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.465240 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: E0313 20:48:59.465784 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.465798 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" Mar 13 20:48:59 crc kubenswrapper[4790]: E0313 20:48:59.465815 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.465820 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.466013 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-httpd" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.466026 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" containerName="glance-log" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.473539 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.476090 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.476412 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.486786 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536425 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536514 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6rb\" (UniqueName: \"kubernetes.io/projected/8c1c1847-eb77-4170-8034-e58ba375ad84-kube-api-access-rn6rb\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536590 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536720 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536742 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-logs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536767 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.536799 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638814 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-logs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638885 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638923 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.638961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639004 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn6rb\" (UniqueName: \"kubernetes.io/projected/8c1c1847-eb77-4170-8034-e58ba375ad84-kube-api-access-rn6rb\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639085 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.639561 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") 
pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.640325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.640674 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c1c1847-eb77-4170-8034-e58ba375ad84-logs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.649484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.649952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-scripts\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.650091 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.651408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c1c1847-eb77-4170-8034-e58ba375ad84-config-data\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.661414 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6rb\" (UniqueName: \"kubernetes.io/projected/8c1c1847-eb77-4170-8034-e58ba375ad84-kube-api-access-rn6rb\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.684542 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2" path="/var/lib/kubelet/pods/6ed0eb88-051d-48ad-a934-3cfb7dbcd0f2/volumes" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.686811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"8c1c1847-eb77-4170-8034-e58ba375ad84\") " pod="openstack/glance-default-external-api-0" Mar 13 20:48:59 crc kubenswrapper[4790]: I0313 20:48:59.929261 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.039629 4790 generic.go:334] "Generic (PLEG): container finished" podID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerID="749c82e4067fc52a2714101b9401b4c82b0470e8a2bd0821a82732111bf3a2ae" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.039858 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kq55v" event={"ID":"86c0a379-8f0b-4414-863c-eaed0745ce2d","Type":"ContainerDied","Data":"749c82e4067fc52a2714101b9401b4c82b0470e8a2bd0821a82732111bf3a2ae"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.042195 4790 generic.go:334] "Generic (PLEG): container finished" podID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerID="ac99b8592ceb7c3e6a37fbb0c9de0300f9c9ee5a2b4807abffe2d2ed52e8fe04" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.042256 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe05-account-create-update-dwwd8" event={"ID":"1a4ef124-b4dd-43df-bdfb-97c65685977c","Type":"ContainerDied","Data":"ac99b8592ceb7c3e6a37fbb0c9de0300f9c9ee5a2b4807abffe2d2ed52e8fe04"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.044766 4790 generic.go:334] "Generic (PLEG): container finished" podID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerID="c0e58f35f1d7b48efbdbbc91a297aa591c210bb71e60644cb81c14c40a9e45cb" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.044841 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjv8c" event={"ID":"9c861107-6a1d-49f7-bc63-b95008ee5ddc","Type":"ContainerDied","Data":"c0e58f35f1d7b48efbdbbc91a297aa591c210bb71e60644cb81c14c40a9e45cb"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.080131 4790 generic.go:334] "Generic (PLEG): container finished" podID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerID="ecda3f7499b0977157d22e381725d43a5571bfd9425676b723008c4d5d967330" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.080286 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" event={"ID":"536b2b85-21d0-47ba-8825-998dcb7b0058","Type":"ContainerDied","Data":"ecda3f7499b0977157d22e381725d43a5571bfd9425676b723008c4d5d967330"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.080353 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" event={"ID":"536b2b85-21d0-47ba-8825-998dcb7b0058","Type":"ContainerStarted","Data":"a28205db680e41155b7e2c6e7dd8da8bc4d10a1ee2a526bd3778cf937317c277"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.125581 4790 generic.go:334] "Generic (PLEG): container finished" podID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerID="2532c9c9471a4f51d2c72742172102590d5f8b86465110fbcffff19c31b75b68" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.125647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" event={"ID":"00f4f78b-ccfb-4413-9a81-d5b461a5e319","Type":"ContainerDied","Data":"2532c9c9471a4f51d2c72742172102590d5f8b86465110fbcffff19c31b75b68"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.125685 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" 
event={"ID":"00f4f78b-ccfb-4413-9a81-d5b461a5e319","Type":"ContainerStarted","Data":"69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.128105 4790 generic.go:334] "Generic (PLEG): container finished" podID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerID="15f4fd3d9e2092ff500a17b34ac7be646f532a2e8275aea162c7ec8133dbdbed" exitCode=0 Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.128147 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lrnph" event={"ID":"dcc0f61e-f0ce-4443-9eec-0488ff92b388","Type":"ContainerDied","Data":"15f4fd3d9e2092ff500a17b34ac7be646f532a2e8275aea162c7ec8133dbdbed"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.128189 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lrnph" event={"ID":"dcc0f61e-f0ce-4443-9eec-0488ff92b388","Type":"ContainerStarted","Data":"9fc00c50ad54c36a8895a69bc200716f7b29a72c8e6c77a86f7e0ef0f4300cd7"} Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.314228 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-798495789f-5fvw5" Mar 13 20:49:00 crc kubenswrapper[4790]: I0313 20:49:00.475831 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.139071 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c1c1847-eb77-4170-8034-e58ba375ad84","Type":"ContainerStarted","Data":"212ef3834c313c563c3500c5f8a4cd559dfe0819e1a567dfece2c1a2feab01d6"} Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.481436 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.586037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") pod \"1a4ef124-b4dd-43df-bdfb-97c65685977c\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.586799 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") pod \"1a4ef124-b4dd-43df-bdfb-97c65685977c\" (UID: \"1a4ef124-b4dd-43df-bdfb-97c65685977c\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.588033 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a4ef124-b4dd-43df-bdfb-97c65685977c" (UID: "1a4ef124-b4dd-43df-bdfb-97c65685977c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.599847 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9" (OuterVolumeSpecName: "kube-api-access-fn4b9") pod "1a4ef124-b4dd-43df-bdfb-97c65685977c" (UID: "1a4ef124-b4dd-43df-bdfb-97c65685977c"). InnerVolumeSpecName "kube-api-access-fn4b9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.689721 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a4ef124-b4dd-43df-bdfb-97c65685977c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.689763 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fn4b9\" (UniqueName: \"kubernetes.io/projected/1a4ef124-b4dd-43df-bdfb-97c65685977c-kube-api-access-fn4b9\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.865747 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.867105 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-854ddc4bd-b4ws7" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.883318 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.893548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") pod \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.893626 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") pod \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\" (UID: \"9c861107-6a1d-49f7-bc63-b95008ee5ddc\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.896301 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c861107-6a1d-49f7-bc63-b95008ee5ddc" (UID: "9c861107-6a1d-49f7-bc63-b95008ee5ddc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.900455 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42" (OuterVolumeSpecName: "kube-api-access-6hr42") pod "9c861107-6a1d-49f7-bc63-b95008ee5ddc" (UID: "9c861107-6a1d-49f7-bc63-b95008ee5ddc"). InnerVolumeSpecName "kube-api-access-6hr42". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.902094 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.990898 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995259 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") pod \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") pod \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\" (UID: \"dcc0f61e-f0ce-4443-9eec-0488ff92b388\") " Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995855 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hr42\" (UniqueName: \"kubernetes.io/projected/9c861107-6a1d-49f7-bc63-b95008ee5ddc-kube-api-access-6hr42\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.995875 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c861107-6a1d-49f7-bc63-b95008ee5ddc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:01 crc kubenswrapper[4790]: I0313 20:49:01.996356 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dcc0f61e-f0ce-4443-9eec-0488ff92b388" (UID: "dcc0f61e-f0ce-4443-9eec-0488ff92b388"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.003230 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cd9b448d6-w8fcr" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" containerID="cri-o://f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd" gracePeriod=30 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.003476 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cd9b448d6-w8fcr" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" containerID="cri-o://963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b" gracePeriod=30 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.008729 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs" (OuterVolumeSpecName: "kube-api-access-ddvrs") pod "dcc0f61e-f0ce-4443-9eec-0488ff92b388" (UID: "dcc0f61e-f0ce-4443-9eec-0488ff92b388"). InnerVolumeSpecName "kube-api-access-ddvrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.033183 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.039630 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.097798 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddvrs\" (UniqueName: \"kubernetes.io/projected/dcc0f61e-f0ce-4443-9eec-0488ff92b388-kube-api-access-ddvrs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.097832 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dcc0f61e-f0ce-4443-9eec-0488ff92b388-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.102554 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.166856 4790 generic.go:334] "Generic (PLEG): container finished" podID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerID="f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd" exitCode=143 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.166928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerDied","Data":"f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.174128 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c1c1847-eb77-4170-8034-e58ba375ad84","Type":"ContainerStarted","Data":"fbaa139db9b0d8e939decfd665a0d823d372df7d7447765076205b24cf476904"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.187041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-jjv8c" event={"ID":"9c861107-6a1d-49f7-bc63-b95008ee5ddc","Type":"ContainerDied","Data":"42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.187093 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42edcd948c944d64b36bfee1b171b205451d925a6df9a6b7c585b9771af386fe" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.187171 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-jjv8c" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202450 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") pod \"536b2b85-21d0-47ba-8825-998dcb7b0058\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202570 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") pod \"536b2b85-21d0-47ba-8825-998dcb7b0058\" (UID: \"536b2b85-21d0-47ba-8825-998dcb7b0058\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202593 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") pod \"86c0a379-8f0b-4414-863c-eaed0745ce2d\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202621 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") pod \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") pod \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\" (UID: \"00f4f78b-ccfb-4413-9a81-d5b461a5e319\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.202690 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") pod \"86c0a379-8f0b-4414-863c-eaed0745ce2d\" (UID: \"86c0a379-8f0b-4414-863c-eaed0745ce2d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.203730 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "536b2b85-21d0-47ba-8825-998dcb7b0058" (UID: "536b2b85-21d0-47ba-8825-998dcb7b0058"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.203917 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "86c0a379-8f0b-4414-863c-eaed0745ce2d" (UID: "86c0a379-8f0b-4414-863c-eaed0745ce2d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.205008 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "00f4f78b-ccfb-4413-9a81-d5b461a5e319" (UID: "00f4f78b-ccfb-4413-9a81-d5b461a5e319"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.208865 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9" (OuterVolumeSpecName: "kube-api-access-q4vj9") pod "86c0a379-8f0b-4414-863c-eaed0745ce2d" (UID: "86c0a379-8f0b-4414-863c-eaed0745ce2d"). InnerVolumeSpecName "kube-api-access-q4vj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.212886 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5" (OuterVolumeSpecName: "kube-api-access-qdhw5") pod "536b2b85-21d0-47ba-8825-998dcb7b0058" (UID: "536b2b85-21d0-47ba-8825-998dcb7b0058"). InnerVolumeSpecName "kube-api-access-qdhw5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.215025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq" (OuterVolumeSpecName: "kube-api-access-jg9bq") pod "00f4f78b-ccfb-4413-9a81-d5b461a5e319" (UID: "00f4f78b-ccfb-4413-9a81-d5b461a5e319"). InnerVolumeSpecName "kube-api-access-jg9bq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.220516 4790 generic.go:334] "Generic (PLEG): container finished" podID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerID="5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343" exitCode=0 Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.220576 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerDied","Data":"5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.232531 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-lrnph" event={"ID":"dcc0f61e-f0ce-4443-9eec-0488ff92b388","Type":"ContainerDied","Data":"9fc00c50ad54c36a8895a69bc200716f7b29a72c8e6c77a86f7e0ef0f4300cd7"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.232576 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9fc00c50ad54c36a8895a69bc200716f7b29a72c8e6c77a86f7e0ef0f4300cd7" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.232646 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-lrnph" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.238114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-kq55v" event={"ID":"86c0a379-8f0b-4414-863c-eaed0745ce2d","Type":"ContainerDied","Data":"e654e8a2a2f4ddba0cbe74c5ccd432a4c0611cace807bcb47d1cab56837685c1"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.238159 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e654e8a2a2f4ddba0cbe74c5ccd432a4c0611cace807bcb47d1cab56837685c1" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.238226 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-kq55v" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.254097 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-fe05-account-create-update-dwwd8" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.254107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-fe05-account-create-update-dwwd8" event={"ID":"1a4ef124-b4dd-43df-bdfb-97c65685977c","Type":"ContainerDied","Data":"a482d341a0588b1286360a3c7bf6118a7d5c154aafca78a5c2aa6c70a4917ca8"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.254149 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a482d341a0588b1286360a3c7bf6118a7d5c154aafca78a5c2aa6c70a4917ca8" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.261490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" event={"ID":"536b2b85-21d0-47ba-8825-998dcb7b0058","Type":"ContainerDied","Data":"a28205db680e41155b7e2c6e7dd8da8bc4d10a1ee2a526bd3778cf937317c277"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.261537 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a28205db680e41155b7e2c6e7dd8da8bc4d10a1ee2a526bd3778cf937317c277" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.261593 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-926f-account-create-update-nnl2f" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.280496 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.280528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-8821-account-create-update-l6ffx" event={"ID":"00f4f78b-ccfb-4413-9a81-d5b461a5e319","Type":"ContainerDied","Data":"69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809"} Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.280762 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69b6bff4250914cdb389db496c666bea50a463e2d929ddbb170528cbe4829809" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.301528 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305789 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/536b2b85-21d0-47ba-8825-998dcb7b0058-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305834 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdhw5\" (UniqueName: \"kubernetes.io/projected/536b2b85-21d0-47ba-8825-998dcb7b0058-kube-api-access-qdhw5\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305848 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4vj9\" (UniqueName: \"kubernetes.io/projected/86c0a379-8f0b-4414-863c-eaed0745ce2d-kube-api-access-q4vj9\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305858 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg9bq\" (UniqueName: \"kubernetes.io/projected/00f4f78b-ccfb-4413-9a81-d5b461a5e319-kube-api-access-jg9bq\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305870 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/00f4f78b-ccfb-4413-9a81-d5b461a5e319-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.305881 4790 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86c0a379-8f0b-4414-863c-eaed0745ce2d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.406946 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407256 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407453 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407521 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407618 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") pod \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\" (UID: \"773bad92-580e-4a9c-9ba5-eef9d8bbc40d\") " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.407950 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs" (OuterVolumeSpecName: "logs") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.408136 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.408827 4790 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.408852 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.413450 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "glance") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.414355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c" (OuterVolumeSpecName: "kube-api-access-qg82c") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "kube-api-access-qg82c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.418710 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts" (OuterVolumeSpecName: "scripts") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.467247 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.519265 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.521561 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg82c\" (UniqueName: \"kubernetes.io/projected/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-kube-api-access-qg82c\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.521677 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.522169 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.549023 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.551149 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.554528 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data" (OuterVolumeSpecName: "config-data") pod "773bad92-580e-4a9c-9ba5-eef9d8bbc40d" (UID: "773bad92-580e-4a9c-9ba5-eef9d8bbc40d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.625632 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.625678 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:02 crc kubenswrapper[4790]: I0313 20:49:02.625694 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/773bad92-580e-4a9c-9ba5-eef9d8bbc40d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.293753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8c1c1847-eb77-4170-8034-e58ba375ad84","Type":"ContainerStarted","Data":"7f1ce3af76cd7e08593479522074ad8373a74ed940c79b7138f7c00e89bb3da5"} Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.298638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"773bad92-580e-4a9c-9ba5-eef9d8bbc40d","Type":"ContainerDied","Data":"b2d041fbf6a68ca43a859ff33ee8b3f4522929d6bbc2ac451a4da91c437362dd"} Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.298691 4790 scope.go:117] "RemoveContainer" containerID="5ed056308fa78044710942e2c9cea38e859d820f8739efda67e4e603f99c4343" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.298819 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.310461 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc306890-4355-4f40-abc0-11753b34d120" containerID="e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22" exitCode=0 Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.310510 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22"} Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.339533 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.339509095 podStartE2EDuration="4.339509095s" podCreationTimestamp="2026-03-13 20:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:03.333655265 +0000 UTC m=+1274.354771156" watchObservedRunningTime="2026-03-13 20:49:03.339509095 +0000 UTC m=+1274.360624986" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.367124 4790 scope.go:117] "RemoveContainer" containerID="6f71db6d93e0c718a70afb3c8920d1131d779aebc23ca039251f6366967791e6" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.399441 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.410464 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.460612 4790 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461026 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461046 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461056 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461062 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461071 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461077 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461089 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461097 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461107 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461113 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461128 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461133 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461143 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461150 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: E0313 20:49:03.461158 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461320 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" 
containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461334 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-log" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461346 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461355 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461362 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" containerName="glance-httpd" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461399 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461409 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" containerName="mariadb-account-create-update" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.461428 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" containerName="mariadb-database-create" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.462586 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.470669 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.470820 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.470984 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.644824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.644904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.644959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645051 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645075 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645295 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw5hn\" (UniqueName: \"kubernetes.io/projected/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-kube-api-access-vw5hn\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645459 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.645617 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.671636 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="773bad92-580e-4a9c-9ba5-eef9d8bbc40d" path="/var/lib/kubelet/pods/773bad92-580e-4a9c-9ba5-eef9d8bbc40d/volumes" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.687944 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.747850 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.748249 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.748668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.748838 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749123 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749155 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw5hn\" (UniqueName: \"kubernetes.io/projected/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-kube-api-access-vw5hn\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.749501 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.750108 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc 
kubenswrapper[4790]: I0313 20:49:03.750611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.750951 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-logs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.756272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.758523 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.775312 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw5hn\" (UniqueName: \"kubernetes.io/projected/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-kube-api-access-vw5hn\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.775664 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.776638 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b5b10e44-e0ce-4568-b33c-dd9855d61fd7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.792078 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"glance-default-internal-api-0\" (UID: \"b5b10e44-e0ce-4568-b33c-dd9855d61fd7\") " pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852284 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852418 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852511 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852607 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852669 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852692 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") pod \"dc306890-4355-4f40-abc0-11753b34d120\" (UID: \"dc306890-4355-4f40-abc0-11753b34d120\") " Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.852845 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.853152 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.853434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.856169 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.856978 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts" (OuterVolumeSpecName: "scripts") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.860583 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9" (OuterVolumeSpecName: "kube-api-access-xntq9") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "kube-api-access-xntq9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.900478 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.950274 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956099 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956128 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xntq9\" (UniqueName: \"kubernetes.io/projected/dc306890-4355-4f40-abc0-11753b34d120-kube-api-access-xntq9\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956138 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dc306890-4355-4f40-abc0-11753b34d120-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956148 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:03 crc kubenswrapper[4790]: I0313 20:49:03.956160 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.009643 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data" (OuterVolumeSpecName: "config-data") pod "dc306890-4355-4f40-abc0-11753b34d120" (UID: "dc306890-4355-4f40-abc0-11753b34d120"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.058363 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc306890-4355-4f40-abc0-11753b34d120-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.324936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dc306890-4355-4f40-abc0-11753b34d120","Type":"ContainerDied","Data":"024f2c04fd4b7dc120d8e7fd5885a7dd4f3c3552f5f6d2b723fe33619d522ce0"} Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.324961 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.325003 4790 scope.go:117] "RemoveContainer" containerID="dd19d1f47a779bc0eefe03ea425f43911b8fa1cade11d838fb762ff00ee08c99" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.347011 4790 scope.go:117] "RemoveContainer" containerID="c12845e4c31624900dc62dbd98c8791cea0c6f646b09e9fa4c0931ec955bfc38" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.367465 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.378097 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.381514 4790 scope.go:117] "RemoveContainer" containerID="427f4e121de836625292dc58d9f628e241b940ecfee11bfb04fb92802c2bd9a9" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.388707 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389148 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389167 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389183 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389189 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-notification-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389207 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389214 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" Mar 13 20:49:04 crc kubenswrapper[4790]: E0313 20:49:04.389239 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389245 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389414 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" 
containerName="ceilometer-notification-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389423 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="ceilometer-central-agent" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389440 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="sg-core" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.389451 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc306890-4355-4f40-abc0-11753b34d120" containerName="proxy-httpd" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.390993 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.395494 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.395897 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.404790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.409588 4790 scope.go:117] "RemoveContainer" containerID="e77c2f06981ff16ce8a83ea4cf86ff45903e943a0fce3443c6ecd2493b205d22" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.438954 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 13 20:49:04 crc kubenswrapper[4790]: W0313 20:49:04.441755 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5b10e44_e0ce_4568_b33c_dd9855d61fd7.slice/crio-e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017 WatchSource:0}: Error finding container e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017: Status 404 returned error can't find the container with id e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017 Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.568979 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569101 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569201 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569328 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569362 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.569787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671330 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671407 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671547 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671616 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671670 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671733 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.671873 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.673858 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.677787 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.679515 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.680249 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.703063 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.703501 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"ceilometer-0\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " pod="openstack/ceilometer-0" Mar 13 20:49:04 crc kubenswrapper[4790]: I0313 20:49:04.731574 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.210418 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.364884 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"78bf65debeefb94f3999e2e736301029c0c817dd3ba45f159bad72d2cdf7dd64"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.367723 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5b10e44-e0ce-4568-b33c-dd9855d61fd7","Type":"ContainerStarted","Data":"54ab8d08cdb0518dcdc67eeefbbb5b95f198e57d4e571b3e396b5da4783891d6"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.367754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5b10e44-e0ce-4568-b33c-dd9855d61fd7","Type":"ContainerStarted","Data":"e040dc473abb82760a60af4057ceac82d6175c820d5c14843f969c831eb47017"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.369287 4790 generic.go:334] "Generic (PLEG): container finished" podID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerID="963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b" exitCode=0 Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.369313 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerDied","Data":"963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b"} Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.562789 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.670642 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc306890-4355-4f40-abc0-11753b34d120" path="/var/lib/kubelet/pods/dc306890-4355-4f40-abc0-11753b34d120/volumes" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700040 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700131 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700152 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700286 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700317 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700397 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") pod \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\" (UID: \"88252e8c-21d9-402a-bab0-9f61b5eb3a70\") " Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.700970 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs" (OuterVolumeSpecName: "logs") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.706397 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts" (OuterVolumeSpecName: "scripts") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.719569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj" (OuterVolumeSpecName: "kube-api-access-xl6nj") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "kube-api-access-xl6nj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.794640 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.798471 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data" (OuterVolumeSpecName: "config-data") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802312 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802351 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/88252e8c-21d9-402a-bab0-9f61b5eb3a70-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802367 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl6nj\" (UniqueName: \"kubernetes.io/projected/88252e8c-21d9-402a-bab0-9f61b5eb3a70-kube-api-access-xl6nj\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802399 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.802411 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.819590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.836577 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "88252e8c-21d9-402a-bab0-9f61b5eb3a70" (UID: "88252e8c-21d9-402a-bab0-9f61b5eb3a70"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.903788 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:05 crc kubenswrapper[4790]: I0313 20:49:05.903829 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/88252e8c-21d9-402a-bab0-9f61b5eb3a70-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.083719 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.389823 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cd9b448d6-w8fcr" event={"ID":"88252e8c-21d9-402a-bab0-9f61b5eb3a70","Type":"ContainerDied","Data":"5d216af4785a04f3e8536b6945d51a46024ad4cfced21083156e56a883fa3cab"} Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.389856 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cd9b448d6-w8fcr" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.391025 4790 scope.go:117] "RemoveContainer" containerID="963374fd67ec679caf00dd9bcc27806bbcfe92963bf34ed2f7df82c29a36025b" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.400971 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df"} Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.404195 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"b5b10e44-e0ce-4568-b33c-dd9855d61fd7","Type":"ContainerStarted","Data":"9a171e4fd7775bef92c105269dc6240f03221a50b1282faf0071e3ee05776514"} Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.422148 4790 scope.go:117] "RemoveContainer" containerID="f4b58b71174400c77e39715f7e4970a3816e119db2203fb9220a857f485f79bd" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.438349 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.438326703 podStartE2EDuration="3.438326703s" podCreationTimestamp="2026-03-13 20:49:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:06.430347305 +0000 UTC m=+1277.451463206" watchObservedRunningTime="2026-03-13 20:49:06.438326703 +0000 UTC m=+1277.459442594" Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.462775 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:49:06 crc kubenswrapper[4790]: I0313 20:49:06.472398 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6cd9b448d6-w8fcr"] Mar 13 20:49:07 crc kubenswrapper[4790]: I0313 20:49:07.416469 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3"} Mar 13 20:49:07 crc kubenswrapper[4790]: I0313 20:49:07.672299 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" path="/var/lib/kubelet/pods/88252e8c-21d9-402a-bab0-9f61b5eb3a70/volumes" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.179021 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:49:08 crc kubenswrapper[4790]: E0313 20:49:08.179776 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.179800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" Mar 13 20:49:08 crc kubenswrapper[4790]: E0313 20:49:08.179822 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.179830 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.180071 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-api" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.180094 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="88252e8c-21d9-402a-bab0-9f61b5eb3a70" containerName="placement-log" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.180853 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.184049 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2wmdt" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.184268 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.184439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.201347 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250595 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " 
pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.250693 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.352538 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.353112 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.353142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.353215 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.359146 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.360069 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.371203 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.376126 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"nova-cell0-conductor-db-sync-82klj\" (UID: 
\"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.428522 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2"} Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.498210 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:08 crc kubenswrapper[4790]: I0313 20:49:08.972089 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:49:08 crc kubenswrapper[4790]: W0313 20:49:08.976396 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod04b866fe_5d7d_46ab_9074_b93ddc7724f0.slice/crio-b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff WatchSource:0}: Error finding container b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff: Status 404 returned error can't find the container with id b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff Mar 13 20:49:09 crc kubenswrapper[4790]: I0313 20:49:09.442666 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerStarted","Data":"b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff"} Mar 13 20:49:09 crc kubenswrapper[4790]: I0313 20:49:09.930757 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:49:09 crc kubenswrapper[4790]: I0313 20:49:09.931108 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.000784 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.016249 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453647 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerStarted","Data":"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03"} Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453944 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453968 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.453945 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" containerID="cri-o://926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.454014 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" containerID="cri-o://786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.454041 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" containerID="cri-o://b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.454099 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" containerID="cri-o://327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" gracePeriod=30 Mar 13 20:49:10 crc kubenswrapper[4790]: I0313 20:49:10.482340 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.101172259 podStartE2EDuration="6.482319522s" podCreationTimestamp="2026-03-13 20:49:04 +0000 UTC" firstStartedPulling="2026-03-13 20:49:05.247147005 +0000 UTC m=+1276.268262896" lastFinishedPulling="2026-03-13 20:49:09.628294268 +0000 UTC m=+1280.649410159" observedRunningTime="2026-03-13 20:49:10.477698175 +0000 UTC m=+1281.498814096" watchObservedRunningTime="2026-03-13 20:49:10.482319522 +0000 UTC m=+1281.503435413" Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475550 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" exitCode=0 Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475842 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" exitCode=2 Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475851 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" exitCode=0 Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475625 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03"} Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2"} Mar 13 20:49:11 crc kubenswrapper[4790]: I0313 20:49:11.475960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3"} Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.027349 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132427 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132493 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132559 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132586 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132673 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.132716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") pod \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\" (UID: \"e2f1d856-14cc-48bb-b155-e74f8e5a9b56\") " Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.134437 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.134787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.138983 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts" (OuterVolumeSpecName: "scripts") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.139218 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j" (OuterVolumeSpecName: "kube-api-access-nfs8j") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "kube-api-access-nfs8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.167716 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.216304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234645 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfs8j\" (UniqueName: \"kubernetes.io/projected/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-kube-api-access-nfs8j\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234689 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234700 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234712 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234722 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.234733 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.239054 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data" (OuterVolumeSpecName: "config-data") pod "e2f1d856-14cc-48bb-b155-e74f8e5a9b56" (UID: "e2f1d856-14cc-48bb-b155-e74f8e5a9b56"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.335712 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2f1d856-14cc-48bb-b155-e74f8e5a9b56-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489453 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" exitCode=0 Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489504 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df"} Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e2f1d856-14cc-48bb-b155-e74f8e5a9b56","Type":"ContainerDied","Data":"78bf65debeefb94f3999e2e736301029c0c817dd3ba45f159bad72d2cdf7dd64"} Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489558 4790 scope.go:117] "RemoveContainer" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.489713 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.528579 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.549578 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.566772 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567132 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567144 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567166 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567172 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567181 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567187 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: E0313 20:49:12.567205 4790 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567211 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567974 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-central-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.567996 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="proxy-httpd" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.568008 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="sg-core" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.568030 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" containerName="ceilometer-notification-agent" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.569988 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.576290 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577480 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577575 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.577854 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.617911 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746477 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746571 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746610 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746664 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746705 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746838 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.746862 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848200 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848552 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848824 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.848926 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.849028 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.849133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.849692 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.850581 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.854941 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.855466 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.856261 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.856827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.871160 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"ceilometer-0\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " pod="openstack/ceilometer-0" Mar 13 20:49:12 crc kubenswrapper[4790]: I0313 20:49:12.897646 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.672683 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f1d856-14cc-48bb-b155-e74f8e5a9b56" path="/var/lib/kubelet/pods/e2f1d856-14cc-48bb-b155-e74f8e5a9b56/volumes" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.856693 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.856768 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.897511 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:13 crc kubenswrapper[4790]: I0313 20:49:13.908006 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:14 crc kubenswrapper[4790]: I0313 20:49:14.523426 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:14 crc kubenswrapper[4790]: I0313 20:49:14.523785 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:14 crc kubenswrapper[4790]: I0313 20:49:14.992325 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:16 crc kubenswrapper[4790]: I0313 20:49:16.659222 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:16 crc kubenswrapper[4790]: I0313 20:49:16.659661 4790 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 20:49:16 crc kubenswrapper[4790]: I0313 20:49:16.674918 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.385083 4790 scope.go:117] "RemoveContainer" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.460985 4790 scope.go:117] "RemoveContainer" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.599607 4790 scope.go:117] "RemoveContainer" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.636539 4790 scope.go:117] "RemoveContainer" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.637039 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03\": container with ID starting with 327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03 not found: ID does not exist" containerID="327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637097 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03"} err="failed to get container status 
\"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03\": rpc error: code = NotFound desc = could not find container \"327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03\": container with ID starting with 327e3e03b2dfb71920c342206eab203aa5a29e035dabb6c2d5de1b62e6d3ec03 not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637129 4790 scope.go:117] "RemoveContainer" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.637503 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2\": container with ID starting with 786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2 not found: ID does not exist" containerID="786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637525 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2"} err="failed to get container status \"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2\": rpc error: code = NotFound desc = could not find container \"786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2\": container with ID starting with 786a1d0c00be91fe9d90e376b01c50d35d69b5dd59fa2c4e54f62f8dba2b9bd2 not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.637537 4790 scope.go:117] "RemoveContainer" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.637983 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3\": container with ID starting with b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3 not found: ID does not exist" containerID="b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.638000 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3"} err="failed to get container status \"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3\": rpc error: code = NotFound desc = could not find container \"b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3\": container with ID starting with b652bcc16560ff341ae546c33cd02c000bbc994f0bf747be9040c747295799c3 not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.638012 4790 scope.go:117] "RemoveContainer" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" Mar 13 20:49:18 crc kubenswrapper[4790]: E0313 20:49:18.647726 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df\": container with ID starting with 926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df not found: ID does not exist" containerID="926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.647760 4790 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df"} err="failed to get container status \"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df\": rpc error: code = NotFound desc = could not find container \"926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df\": container with ID starting with 926226a623882282e16cc34de847e4acb5a3cafff95d6f8a8f7600d23d4047df not found: ID does not exist" Mar 13 20:49:18 crc kubenswrapper[4790]: I0313 20:49:18.941984 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:22 crc kubenswrapper[4790]: I0313 20:49:22.629269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"1d97979fdd68d0748ba8fa4d7f33307ed19c474507126a6137c97c42d6089130"} Mar 13 20:49:23 crc kubenswrapper[4790]: I0313 20:49:23.643850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerStarted","Data":"6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f"} Mar 13 20:49:23 crc kubenswrapper[4790]: I0313 20:49:23.680751 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-82klj" podStartSLOduration=2.100135928 podStartE2EDuration="15.680726442s" podCreationTimestamp="2026-03-13 20:49:08 +0000 UTC" firstStartedPulling="2026-03-13 20:49:08.978883466 +0000 UTC m=+1279.999999357" lastFinishedPulling="2026-03-13 20:49:22.55947398 +0000 UTC m=+1293.580589871" observedRunningTime="2026-03-13 20:49:23.669094435 +0000 UTC m=+1294.690210326" watchObservedRunningTime="2026-03-13 20:49:23.680726442 +0000 UTC m=+1294.701842333" Mar 13 20:49:23 crc kubenswrapper[4790]: I0313 20:49:23.693057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289"} Mar 13 20:49:24 crc kubenswrapper[4790]: I0313 20:49:24.682280 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280"} Mar 13 20:49:24 crc kubenswrapper[4790]: I0313 20:49:24.682890 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1"} Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerStarted","Data":"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c"} Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.704613 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703463 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" 
containerID="cri-o://909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703455 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" containerID="cri-o://2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703483 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" containerID="cri-o://407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.703455 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" containerID="cri-o://2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" gracePeriod=30 Mar 13 20:49:26 crc kubenswrapper[4790]: I0313 20:49:26.744322 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=11.213326324 podStartE2EDuration="14.744305009s" podCreationTimestamp="2026-03-13 20:49:12 +0000 UTC" firstStartedPulling="2026-03-13 20:49:22.545593292 +0000 UTC m=+1293.566709183" lastFinishedPulling="2026-03-13 20:49:26.076571977 +0000 UTC m=+1297.097687868" observedRunningTime="2026-03-13 20:49:26.737509653 +0000 UTC m=+1297.758625554" watchObservedRunningTime="2026-03-13 20:49:26.744305009 +0000 UTC m=+1297.765420900" Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718088 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" exitCode=0 Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718441 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" exitCode=2 Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718453 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" exitCode=0 Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718160 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c"} Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280"} Mar 13 20:49:27 crc kubenswrapper[4790]: I0313 20:49:27.718509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1"} Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.684596 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768637 4790 generic.go:334] "Generic (PLEG): container finished" podID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" exitCode=0 Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768682 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289"} Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768705 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e0bf0d8a-16c3-4a20-905d-f08a5906ded3","Type":"ContainerDied","Data":"1d97979fdd68d0748ba8fa4d7f33307ed19c474507126a6137c97c42d6089130"} Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768722 4790 scope.go:117] "RemoveContainer" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.768834 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.789285 4790 scope.go:117] "RemoveContainer" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.812867 4790 scope.go:117] "RemoveContainer" containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.835235 4790 scope.go:117] "RemoveContainer" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843579 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843715 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843744 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843796 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843874 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc 
kubenswrapper[4790]: I0313 20:49:31.843926 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.843952 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") pod \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\" (UID: \"e0bf0d8a-16c3-4a20-905d-f08a5906ded3\") " Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.844326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.844439 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.849867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts" (OuterVolumeSpecName: "scripts") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.851051 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8" (OuterVolumeSpecName: "kube-api-access-6ppp8") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "kube-api-access-6ppp8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.858853 4790 scope.go:117] "RemoveContainer" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.859751 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c\": container with ID starting with 2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c not found: ID does not exist" containerID="2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.859811 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c"} err="failed to get container status \"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c\": rpc error: code = NotFound desc = could not find container \"2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c\": container with ID starting with 2238144f0093d78194733a03cd653bde3a18d7ec799575edb88f8b5ca9847a1c not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.859845 4790 scope.go:117] "RemoveContainer" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.860327 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280\": container with ID starting with 909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280 not found: ID does not exist" containerID="909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860409 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280"} err="failed to get container status \"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280\": rpc error: code = NotFound desc = could not find container \"909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280\": container with ID starting with 909767672ffe827145ea7844c92344a32db5d147e3594af41e55d99832958280 not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860449 4790 scope.go:117] "RemoveContainer" containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.860827 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1\": container with ID starting with 407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1 not found: ID does not exist" containerID="407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860861 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1"} err="failed to get container status \"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1\": rpc error: code = NotFound desc = could not 
find container \"407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1\": container with ID starting with 407b3797104bba8a2de13d18862b19c3de4c67d532193f5315f42d55b51d47b1 not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.860881 4790 scope.go:117] "RemoveContainer" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" Mar 13 20:49:31 crc kubenswrapper[4790]: E0313 20:49:31.861185 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289\": container with ID starting with 2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289 not found: ID does not exist" containerID="2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.861215 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289"} err="failed to get container status \"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289\": rpc error: code = NotFound desc = could not find container \"2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289\": container with ID starting with 2cd21576ac1ce15c83eaa4250460c254f32b00d838d786eb3e015fb42914e289 not found: ID does not exist" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.886737 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.918702 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.937474 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data" (OuterVolumeSpecName: "config-data") pod "e0bf0d8a-16c3-4a20-905d-f08a5906ded3" (UID: "e0bf0d8a-16c3-4a20-905d-f08a5906ded3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945535 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945578 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945587 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945596 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945605 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945612 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:31 crc kubenswrapper[4790]: I0313 20:49:31.945621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ppp8\" (UniqueName: \"kubernetes.io/projected/e0bf0d8a-16c3-4a20-905d-f08a5906ded3-kube-api-access-6ppp8\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.100820 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.109610 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125138 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125534 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125558 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125670 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125684 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125702 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125709 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" 
containerName="ceilometer-central-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: E0313 20:49:32.125726 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125733 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125885 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="sg-core" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125899 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="proxy-httpd" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125919 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-notification-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.125932 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" containerName="ceilometer-central-agent" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.128145 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.131010 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.131018 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.137747 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.250679 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251192 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251329 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251478 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.251907 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.353858 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.353908 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354001 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354071 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354087 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354118 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354136 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.354485 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.355195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.359187 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.359583 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.366442 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.367593 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.372359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"ceilometer-0\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.450824 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:49:32 crc kubenswrapper[4790]: I0313 20:49:32.896251 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:49:32 crc kubenswrapper[4790]: W0313 20:49:32.897174 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe WatchSource:0}: Error finding container d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe: Status 404 returned error can't find the container with id d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.670148 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0bf0d8a-16c3-4a20-905d-f08a5906ded3" path="/var/lib/kubelet/pods/e0bf0d8a-16c3-4a20-905d-f08a5906ded3/volumes" Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.804877 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603"} Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.804948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe"} Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.808657 4790 generic.go:334] "Generic (PLEG): container finished" podID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerID="6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f" exitCode=0 Mar 13 20:49:33 crc kubenswrapper[4790]: I0313 20:49:33.808753 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerDied","Data":"6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f"} Mar 13 20:49:34 crc kubenswrapper[4790]: I0313 20:49:34.837266 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6"} Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.169401 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.309931 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.310032 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.310223 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.310292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") pod \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\" (UID: \"04b866fe-5d7d-46ab-9074-b93ddc7724f0\") " Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.317549 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts" (OuterVolumeSpecName: "scripts") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.339090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7" (OuterVolumeSpecName: "kube-api-access-4hlv7") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "kube-api-access-4hlv7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.345749 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data" (OuterVolumeSpecName: "config-data") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.349899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "04b866fe-5d7d-46ab-9074-b93ddc7724f0" (UID: "04b866fe-5d7d-46ab-9074-b93ddc7724f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.412927 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hlv7\" (UniqueName: \"kubernetes.io/projected/04b866fe-5d7d-46ab-9074-b93ddc7724f0-kube-api-access-4hlv7\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.413243 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.413298 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.413314 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/04b866fe-5d7d-46ab-9074-b93ddc7724f0-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.849865 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-82klj" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.849857 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-82klj" event={"ID":"04b866fe-5d7d-46ab-9074-b93ddc7724f0","Type":"ContainerDied","Data":"b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff"} Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.851091 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b43dc93316c7d4a150902a2bc087f9dbd70b6a0fb345d27db2100e27d21c97ff" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.853473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922"} Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.933137 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 20:49:35 crc kubenswrapper[4790]: E0313 20:49:35.933797 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerName="nova-cell0-conductor-db-sync" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.933891 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerName="nova-cell0-conductor-db-sync" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.934133 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" containerName="nova-cell0-conductor-db-sync" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.934834 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.936928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.938330 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-2wmdt" Mar 13 20:49:35 crc kubenswrapper[4790]: I0313 20:49:35.946265 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.024297 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.024410 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qf72\" (UniqueName: \"kubernetes.io/projected/253ef3a1-1764-4120-a5f8-db908a0e7fd4-kube-api-access-7qf72\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.024506 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.126082 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.126166 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.126218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qf72\" (UniqueName: \"kubernetes.io/projected/253ef3a1-1764-4120-a5f8-db908a0e7fd4-kube-api-access-7qf72\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.162843 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.162974 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253ef3a1-1764-4120-a5f8-db908a0e7fd4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" 
(UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.166396 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qf72\" (UniqueName: \"kubernetes.io/projected/253ef3a1-1764-4120-a5f8-db908a0e7fd4-kube-api-access-7qf72\") pod \"nova-cell0-conductor-0\" (UID: \"253ef3a1-1764-4120-a5f8-db908a0e7fd4\") " pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.252791 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.732943 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 13 20:49:36 crc kubenswrapper[4790]: W0313 20:49:36.740257 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253ef3a1_1764_4120_a5f8_db908a0e7fd4.slice/crio-c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a WatchSource:0}: Error finding container c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a: Status 404 returned error can't find the container with id c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a Mar 13 20:49:36 crc kubenswrapper[4790]: I0313 20:49:36.865194 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"253ef3a1-1764-4120-a5f8-db908a0e7fd4","Type":"ContainerStarted","Data":"c3e8cd3cca4cbe31f6adb96fd6c65d496456b81c4cd6e438339154cc6c12d22a"} Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.877761 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerStarted","Data":"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46"} Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.878639 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.882254 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"253ef3a1-1764-4120-a5f8-db908a0e7fd4","Type":"ContainerStarted","Data":"961aa80bee89b00f627b8513e50e0b372633913ae24bf7e58526880debf770ae"} Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.882427 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.915677 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.226079191 podStartE2EDuration="5.915659282s" podCreationTimestamp="2026-03-13 20:49:32 +0000 UTC" firstStartedPulling="2026-03-13 20:49:32.899626073 +0000 UTC m=+1303.920741964" lastFinishedPulling="2026-03-13 20:49:36.589206164 +0000 UTC m=+1307.610322055" observedRunningTime="2026-03-13 20:49:37.903547432 +0000 UTC m=+1308.924663323" watchObservedRunningTime="2026-03-13 20:49:37.915659282 +0000 UTC m=+1308.936775173" Mar 13 20:49:37 crc kubenswrapper[4790]: I0313 20:49:37.926831 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.926810016 podStartE2EDuration="2.926810016s" podCreationTimestamp="2026-03-13 20:49:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:37.923034274 +0000 UTC m=+1308.944150165" watchObservedRunningTime="2026-03-13 20:49:37.926810016 +0000 UTC m=+1308.947925897" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.285914 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.747354 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.748895 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.751633 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.755685 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.757778 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836100 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836190 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.836243 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: 
\"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938452 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.938478 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.947754 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.955010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.957408 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.974729 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"nova-cell0-cell-mapping-gj4j7\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.976415 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.977674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.980218 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 20:49:41 crc kubenswrapper[4790]: I0313 20:49:41.994630 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.073253 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.073476 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.077025 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.080278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.115263 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.164302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.164671 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.164719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.262453 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.264984 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.272802 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301006 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301110 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301185 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301220 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301252 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.301578 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.317498 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.319063 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.326498 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.327760 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.329945 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.330180 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.343981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.349789 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.355964 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.364224 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"nova-scheduler-0\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.401764 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402394 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402419 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402499 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.402555 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.404268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.406689 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod 
\"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.409406 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.431325 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"nova-api-0\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504139 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504194 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504227 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504255 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504273 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504310 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504325 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " 
pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504425 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504455 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504485 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504502 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504535 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.504559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.506050 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.509668 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.510272 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.524617 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"nova-metadata-0\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.548371 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606554 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606584 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.606954 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.607527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.607526 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.607808 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608103 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608125 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608153 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608196 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.608965 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.609604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.611971 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.621208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.623910 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.630543 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxmdj\" (UniqueName: 
\"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"dnsmasq-dns-757b4f8459-jstn6\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.684873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.732834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.756054 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.781407 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:49:42 crc kubenswrapper[4790]: W0313 20:49:42.823471 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968 WatchSource:0}: Error finding container 55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968: Status 404 returned error can't find the container with id 55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968 Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.925720 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:42 crc kubenswrapper[4790]: I0313 20:49:42.945746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerStarted","Data":"55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968"} Mar 13 20:49:42 crc kubenswrapper[4790]: W0313 20:49:42.950967 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401 WatchSource:0}: Error finding container 23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401: Status 404 returned error can't find the container with id 23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401 Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.189097 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.287572 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.289034 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.291772 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.292335 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.298890 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.330088 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.433739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.433784 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.433854 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.434124 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: W0313 20:49:43.475559 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7082b53_1345_4c47_a9bf_b87d9e1fd3ca.slice/crio-51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd WatchSource:0}: Error finding container 51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd: Status 404 returned error can't find the container with id 51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.484158 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536638 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc 
kubenswrapper[4790]: I0313 20:49:43.536688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536773 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.536948 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.542656 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.542986 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.543459 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.554243 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"nova-cell1-conductor-db-sync-bh2vb\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.589922 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.613318 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.959207 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerStarted","Data":"04571522f47168f89424bb71a3f0416a30a9cbe088abe70c50ac3387dcafbc56"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.961187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerStarted","Data":"670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.962895 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerStarted","Data":"23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.965612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerStarted","Data":"51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.967131 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerStarted","Data":"703a1bf7672ad738d5a0561a4b2308100e00dc344ee885923cb3275bce620370"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.971447 4790 generic.go:334] "Generic (PLEG): container finished" podID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerID="62406a3417f49cd6fee467ec15aafed59672de36ebec3945dba28321522a57f0" exitCode=0 Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.971499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerDied","Data":"62406a3417f49cd6fee467ec15aafed59672de36ebec3945dba28321522a57f0"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.971529 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerStarted","Data":"d410e2281728cc5d35324b5d9753eac6a283696daed20ea1b6c874c3b410e22c"} Mar 13 20:49:43 crc kubenswrapper[4790]: I0313 20:49:43.979486 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gj4j7" podStartSLOduration=2.97946824 podStartE2EDuration="2.97946824s" podCreationTimestamp="2026-03-13 20:49:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:43.976247812 +0000 UTC m=+1314.997363703" watchObservedRunningTime="2026-03-13 20:49:43.97946824 +0000 UTC m=+1315.000584131" Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.118058 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:49:44 crc kubenswrapper[4790]: W0313 20:49:44.968303 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255451e0_9cb8_424f_a327_6e7ef4e4d775.slice/crio-e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c WatchSource:0}: 
Error finding container e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c: Status 404 returned error can't find the container with id e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.985988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerStarted","Data":"e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c"} Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.990752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerStarted","Data":"31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f"} Mar 13 20:49:44 crc kubenswrapper[4790]: I0313 20:49:44.990811 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:46 crc kubenswrapper[4790]: I0313 20:49:46.024982 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" podStartSLOduration=4.02495418 podStartE2EDuration="4.02495418s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:45.011178099 +0000 UTC m=+1316.032293990" watchObservedRunningTime="2026-03-13 20:49:46.02495418 +0000 UTC m=+1317.046070071" Mar 13 20:49:46 crc kubenswrapper[4790]: I0313 20:49:46.026991 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:49:46 crc kubenswrapper[4790]: I0313 20:49:46.036638 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.015728 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerStarted","Data":"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.019802 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerStarted","Data":"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.019848 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" gracePeriod=30 Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.026464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerStarted","Data":"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.026515 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerStarted","Data":"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.029881 4790 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerStarted","Data":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.029913 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerStarted","Data":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.030033 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" containerID="cri-o://d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" gracePeriod=30 Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.030147 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" containerID="cri-o://31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" gracePeriod=30 Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.044604 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.313984373 podStartE2EDuration="6.04458572s" podCreationTimestamp="2026-03-13 20:49:41 +0000 UTC" firstStartedPulling="2026-03-13 20:49:42.979174097 +0000 UTC m=+1314.000289988" lastFinishedPulling="2026-03-13 20:49:45.709775444 +0000 UTC m=+1316.730891335" observedRunningTime="2026-03-13 20:49:47.034555337 +0000 UTC m=+1318.055671248" watchObservedRunningTime="2026-03-13 20:49:47.04458572 +0000 UTC m=+1318.065701611" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.053942 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerStarted","Data":"d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115"} Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.062752 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.537505361 podStartE2EDuration="5.062729975s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="2026-03-13 20:49:43.185941497 +0000 UTC m=+1314.207057388" lastFinishedPulling="2026-03-13 20:49:45.711166111 +0000 UTC m=+1316.732282002" observedRunningTime="2026-03-13 20:49:47.060130524 +0000 UTC m=+1318.081246415" watchObservedRunningTime="2026-03-13 20:49:47.062729975 +0000 UTC m=+1318.083845866" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.088639 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.856090979 podStartE2EDuration="5.088614291s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="2026-03-13 20:49:43.47791968 +0000 UTC m=+1314.499035571" lastFinishedPulling="2026-03-13 20:49:45.710442982 +0000 UTC m=+1316.731558883" observedRunningTime="2026-03-13 20:49:47.0761542 +0000 UTC m=+1318.097270091" watchObservedRunningTime="2026-03-13 20:49:47.088614291 +0000 UTC m=+1318.109730182" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.105028 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-metadata-0" podStartSLOduration=2.7307323500000003 podStartE2EDuration="5.105010268s" podCreationTimestamp="2026-03-13 20:49:42 +0000 UTC" firstStartedPulling="2026-03-13 20:49:43.335629569 +0000 UTC m=+1314.356745460" lastFinishedPulling="2026-03-13 20:49:45.709907497 +0000 UTC m=+1316.731023378" observedRunningTime="2026-03-13 20:49:47.092326722 +0000 UTC m=+1318.113442643" watchObservedRunningTime="2026-03-13 20:49:47.105010268 +0000 UTC m=+1318.126126159" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.117818 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" podStartSLOduration=4.117798036 podStartE2EDuration="4.117798036s" podCreationTimestamp="2026-03-13 20:49:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:47.113221952 +0000 UTC m=+1318.134337843" watchObservedRunningTime="2026-03-13 20:49:47.117798036 +0000 UTC m=+1318.138913927" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.402451 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.641681 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740359 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740806 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740850 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.740945 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") pod \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\" (UID: \"e2fd6d31-1072-47b0-aa6b-327fac52a13b\") " Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.741951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs" (OuterVolumeSpecName: "logs") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.758773 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm" (OuterVolumeSpecName: "kube-api-access-gm7sm") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "kube-api-access-gm7sm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.772573 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.779691 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data" (OuterVolumeSpecName: "config-data") pod "e2fd6d31-1072-47b0-aa6b-327fac52a13b" (UID: "e2fd6d31-1072-47b0-aa6b-327fac52a13b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.794669 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843790 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm7sm\" (UniqueName: \"kubernetes.io/projected/e2fd6d31-1072-47b0-aa6b-327fac52a13b-kube-api-access-gm7sm\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843864 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843878 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2fd6d31-1072-47b0-aa6b-327fac52a13b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:47 crc kubenswrapper[4790]: I0313 20:49:47.843887 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2fd6d31-1072-47b0-aa6b-327fac52a13b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.069997 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" exitCode=0 Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070038 4790 generic.go:334] "Generic (PLEG): container finished" podID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" exitCode=143 Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070111 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerDied","Data":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070226 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerDied","Data":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070261 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e2fd6d31-1072-47b0-aa6b-327fac52a13b","Type":"ContainerDied","Data":"04571522f47168f89424bb71a3f0416a30a9cbe088abe70c50ac3387dcafbc56"} Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.070280 4790 scope.go:117] "RemoveContainer" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.118293 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.124007 4790 scope.go:117] "RemoveContainer" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.127370 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143185 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.143628 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143648 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.143663 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143671 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143912 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-metadata" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.143941 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" containerName="nova-metadata-log" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.145004 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.147286 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.147479 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.173050 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.222802 4790 scope.go:117] "RemoveContainer" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.223239 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": container with ID starting with 31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1 not found: ID does not exist" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223288 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} err="failed to get container status \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": rpc error: code = NotFound desc = could not find container \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": container with ID starting with 31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1 not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223308 4790 scope.go:117] "RemoveContainer" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc kubenswrapper[4790]: E0313 20:49:48.223787 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": container with ID starting with d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe not found: ID does not exist" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223815 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} err="failed to get container status \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": rpc error: code = NotFound desc = could not find container \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": container with ID starting with d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.223833 4790 scope.go:117] "RemoveContainer" containerID="31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.224102 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1"} err="failed to get container status \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": rpc error: 
code = NotFound desc = could not find container \"31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1\": container with ID starting with 31ed26bd0fc44b450359c97106de8985c64105f660a17f6d24771017f30575d1 not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.224128 4790 scope.go:117] "RemoveContainer" containerID="d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.224434 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe"} err="failed to get container status \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": rpc error: code = NotFound desc = could not find container \"d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe\": container with ID starting with d80d3cf5a8cc4b6394bb3bd221e198f0fa22e23e7b03f927e1f4eda117f19bfe not found: ID does not exist" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254120 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254182 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254204 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.254337 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356149 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356350 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod 
\"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356438 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356519 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.356824 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.357060 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.363080 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.372946 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.372973 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.380151 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"nova-metadata-0\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " pod="openstack/nova-metadata-0" Mar 13 20:49:48 crc kubenswrapper[4790]: I0313 20:49:48.524726 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:49 crc kubenswrapper[4790]: W0313 20:49:49.051367 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419 WatchSource:0}: Error finding container a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419: Status 404 returned error can't find the container with id a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419 Mar 13 20:49:49 crc kubenswrapper[4790]: I0313 20:49:49.055968 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:49 crc kubenswrapper[4790]: I0313 20:49:49.081063 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerStarted","Data":"a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419"} Mar 13 20:49:49 crc kubenswrapper[4790]: I0313 20:49:49.684091 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2fd6d31-1072-47b0-aa6b-327fac52a13b" path="/var/lib/kubelet/pods/e2fd6d31-1072-47b0-aa6b-327fac52a13b/volumes" Mar 13 20:49:50 crc kubenswrapper[4790]: I0313 20:49:50.093148 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerStarted","Data":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} Mar 13 20:49:50 crc kubenswrapper[4790]: I0313 20:49:50.094477 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerStarted","Data":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} Mar 13 20:49:50 crc kubenswrapper[4790]: I0313 20:49:50.127160 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.127125124 podStartE2EDuration="2.127125124s" podCreationTimestamp="2026-03-13 20:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:50.113087241 +0000 UTC m=+1321.134203152" watchObservedRunningTime="2026-03-13 20:49:50.127125124 +0000 UTC m=+1321.148241055" Mar 13 20:49:51 crc kubenswrapper[4790]: I0313 20:49:51.107029 4790 generic.go:334] "Generic (PLEG): container finished" podID="e71d98c3-e247-448e-945e-016a6755c689" containerID="670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc" exitCode=0 Mar 13 20:49:51 crc kubenswrapper[4790]: I0313 20:49:51.108467 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerDied","Data":"670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc"} Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.402594 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.432082 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.478189 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.550236 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.551193 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.634948 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") pod \"e71d98c3-e247-448e-945e-016a6755c689\" (UID: \"e71d98c3-e247-448e-945e-016a6755c689\") " Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.640350 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts" (OuterVolumeSpecName: "scripts") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.640699 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945" (OuterVolumeSpecName: "kube-api-access-mc945") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "kube-api-access-mc945". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.666821 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data" (OuterVolumeSpecName: "config-data") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.666995 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e71d98c3-e247-448e-945e-016a6755c689" (UID: "e71d98c3-e247-448e-945e-016a6755c689"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.734566 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738621 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738647 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc945\" (UniqueName: \"kubernetes.io/projected/e71d98c3-e247-448e-945e-016a6755c689-kube-api-access-mc945\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738694 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.738702 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e71d98c3-e247-448e-945e-016a6755c689-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.787010 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:49:52 crc kubenswrapper[4790]: I0313 20:49:52.787297 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" containerID="cri-o://716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb" gracePeriod=10 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.134788 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gj4j7" event={"ID":"e71d98c3-e247-448e-945e-016a6755c689","Type":"ContainerDied","Data":"55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968"} Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.134834 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.134912 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gj4j7" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.140939 4790 generic.go:334] "Generic (PLEG): container finished" podID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerID="d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115" exitCode=0 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.141699 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerDied","Data":"d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115"} Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.161562 4790 generic.go:334] "Generic (PLEG): container finished" podID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerID="716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb" exitCode=0 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.161830 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerDied","Data":"716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb"} Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.194465 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.235298 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.319707 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.359921 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360039 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360118 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360264 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.360311 4790 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") pod \"5348982d-ffd4-4226-8c69-1984dc02ffbe\" (UID: \"5348982d-ffd4-4226-8c69-1984dc02ffbe\") " Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.361075 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.362130 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" containerID="cri-o://589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" gracePeriod=30 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.362722 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" containerID="cri-o://0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" gracePeriod=30 Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.380831 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd" (OuterVolumeSpecName: "kube-api-access-5spxd") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "kube-api-access-5spxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.452518 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config" (OuterVolumeSpecName: "config") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.459366 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.461694 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462309 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462403 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5spxd\" (UniqueName: \"kubernetes.io/projected/5348982d-ffd4-4226-8c69-1984dc02ffbe-kube-api-access-5spxd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462478 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.462540 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.469862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.473872 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5348982d-ffd4-4226-8c69-1984dc02ffbe" (UID: "5348982d-ffd4-4226-8c69-1984dc02ffbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.564045 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.564319 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5348982d-ffd4-4226-8c69-1984dc02ffbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.634574 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.634596 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.773955 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:53 crc kubenswrapper[4790]: I0313 20:49:53.955337 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073652 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073825 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073898 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.073971 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.074122 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") pod \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\" (UID: \"37e33a9e-1def-49b1-b1a7-81be1f5e72ee\") " Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.074185 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs" (OuterVolumeSpecName: "logs") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.075230 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.080111 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp" (OuterVolumeSpecName: "kube-api-access-qn2wp") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "kube-api-access-qn2wp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.101104 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.109558 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data" (OuterVolumeSpecName: "config-data") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.126219 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "37e33a9e-1def-49b1-b1a7-81be1f5e72ee" (UID: "37e33a9e-1def-49b1-b1a7-81be1f5e72ee"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.176939 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qn2wp\" (UniqueName: \"kubernetes.io/projected/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-kube-api-access-qn2wp\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.177266 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.177353 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.177449 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/37e33a9e-1def-49b1-b1a7-81be1f5e72ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.184172 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.184101 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c9776ccc5-5hxds" event={"ID":"5348982d-ffd4-4226-8c69-1984dc02ffbe","Type":"ContainerDied","Data":"90cf908fda5bfa83deaae1fd0eac95ba601f9eb9da62b0fab2c3af0677ac98b2"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.184637 4790 scope.go:117] "RemoveContainer" containerID="716981bb1f84db71c9f5dc98afe338733b9e6edcfcc60b6262bd07cda695e5bb" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188325 4790 generic.go:334] "Generic (PLEG): container finished" podID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" exitCode=0 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188353 4790 generic.go:334] "Generic (PLEG): container finished" podID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" exitCode=143 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188372 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerDied","Data":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188414 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188446 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerDied","Data":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.188464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"37e33a9e-1def-49b1-b1a7-81be1f5e72ee","Type":"ContainerDied","Data":"a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419"} Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.189114 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" containerID="cri-o://f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" gracePeriod=30 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.189506 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" containerID="cri-o://1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" gracePeriod=30 Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.227450 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.236110 4790 scope.go:117] "RemoveContainer" containerID="54996574df6debfb6f3430b43b232f15654c266d463f051ee19ed34e62244f6c" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.242784 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c9776ccc5-5hxds"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.261714 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.277078 
4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.290519 4790 scope.go:117] "RemoveContainer" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299429 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299890 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299909 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299929 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299936 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299951 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299957 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299969 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="init" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.299977 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="init" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.299997 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71d98c3-e247-448e-945e-016a6755c689" containerName="nova-manage" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300003 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71d98c3-e247-448e-945e-016a6755c689" containerName="nova-manage" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300172 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71d98c3-e247-448e-945e-016a6755c689" containerName="nova-manage" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300184 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-metadata" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300242 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" containerName="nova-metadata-log" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.300258 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" containerName="dnsmasq-dns" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.301335 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.303927 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.305123 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.308638 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.340930 4790 scope.go:117] "RemoveContainer" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.391654 4790 scope.go:117] "RemoveContainer" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.394524 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": container with ID starting with 0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01 not found: ID does not exist" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.394572 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} err="failed to get container status \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": rpc error: code = NotFound desc = could not find container \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": container with ID starting with 0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01 not found: ID does not exist" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.394598 4790 scope.go:117] "RemoveContainer" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: E0313 20:49:54.398913 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": container with ID starting with 589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161 not found: ID does not exist" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.398972 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} err="failed to get container status \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": rpc error: code = NotFound desc = could not find container \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": container with ID starting with 589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161 not found: ID does not exist" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.399006 4790 scope.go:117] "RemoveContainer" containerID="0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.399841 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01"} err="failed to get container status \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": rpc error: code = NotFound desc = could not find container \"0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01\": container with ID starting with 0d79fe4423cce6dcf744f3eaab4c3b1d5f2d7a5b1906dad698b8f2aaa2442c01 not found: ID does not exist" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.399863 4790 scope.go:117] "RemoveContainer" containerID="589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.400131 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161"} err="failed to get container status \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": rpc error: code = NotFound desc = could not find container \"589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161\": container with ID starting with 589cb3af995e67ab03c81bba752337927de2dcda6c145c9cea9df4bee6073161 not found: ID does not exist" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486036 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486661 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.486811 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732219 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc 
kubenswrapper[4790]: I0313 20:49:54.732298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732343 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.732481 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.734846 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.739348 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.740097 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.744658 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.758811 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"nova-metadata-0\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " pod="openstack/nova-metadata-0" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.846232 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:54 crc kubenswrapper[4790]: I0313 20:49:54.955496 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.060912 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.061292 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.061495 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.061552 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") pod \"255451e0-9cb8-424f-a327-6e7ef4e4d775\" (UID: \"255451e0-9cb8-424f-a327-6e7ef4e4d775\") " Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.071590 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts" (OuterVolumeSpecName: "scripts") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.071771 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd" (OuterVolumeSpecName: "kube-api-access-rlmqd") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "kube-api-access-rlmqd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.100787 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.112127 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data" (OuterVolumeSpecName: "config-data") pod "255451e0-9cb8-424f-a327-6e7ef4e4d775" (UID: "255451e0-9cb8-424f-a327-6e7ef4e4d775"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163896 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163946 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlmqd\" (UniqueName: \"kubernetes.io/projected/255451e0-9cb8-424f-a327-6e7ef4e4d775-kube-api-access-rlmqd\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163963 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.163975 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/255451e0-9cb8-424f-a327-6e7ef4e4d775-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.266071 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 20:49:55 crc kubenswrapper[4790]: E0313 20:49:55.266465 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerName="nova-cell1-conductor-db-sync" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.266479 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerName="nova-cell1-conductor-db-sync" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.266727 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" containerName="nova-cell1-conductor-db-sync" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.267356 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.271961 4790 generic.go:334] "Generic (PLEG): container finished" podID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" exitCode=143 Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.272091 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerDied","Data":"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263"} Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.282225 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" containerID="cri-o://bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" gracePeriod=30 Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.282627 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.284972 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-bh2vb" event={"ID":"255451e0-9cb8-424f-a327-6e7ef4e4d775","Type":"ContainerDied","Data":"e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c"} Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.285007 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.285023 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.368069 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.368431 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p95db\" (UniqueName: \"kubernetes.io/projected/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-kube-api-access-p95db\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.368516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.456813 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.489491 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.490471 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p95db\" (UniqueName: \"kubernetes.io/projected/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-kube-api-access-p95db\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.490626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.501346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: 
\"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.502223 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.529284 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p95db\" (UniqueName: \"kubernetes.io/projected/0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b-kube-api-access-p95db\") pod \"nova-cell1-conductor-0\" (UID: \"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b\") " pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.595800 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.688394 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="37e33a9e-1def-49b1-b1a7-81be1f5e72ee" path="/var/lib/kubelet/pods/37e33a9e-1def-49b1-b1a7-81be1f5e72ee/volumes" Mar 13 20:49:55 crc kubenswrapper[4790]: I0313 20:49:55.689559 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5348982d-ffd4-4226-8c69-1984dc02ffbe" path="/var/lib/kubelet/pods/5348982d-ffd4-4226-8c69-1984dc02ffbe/volumes" Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.014757 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.293401 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerStarted","Data":"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.293451 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerStarted","Data":"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.293463 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerStarted","Data":"1998e9c0ed2f49c51df1fb979275385f3c3c928b8ffbba368fee9881d45e3a34"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.295159 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b","Type":"ContainerStarted","Data":"090e00a03545aa608baa22ecbda44610983cec4d0dc2ac4b17f3618770499479"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.295184 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b","Type":"ContainerStarted","Data":"36392374d77fd75658a5cb748dd738c66a6b5e5859fa85fa2b6ce243e537f157"} Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.295331 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.314276 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" 
podStartSLOduration=2.314255326 podStartE2EDuration="2.314255326s" podCreationTimestamp="2026-03-13 20:49:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:56.308241482 +0000 UTC m=+1327.329357373" watchObservedRunningTime="2026-03-13 20:49:56.314255326 +0000 UTC m=+1327.335371217" Mar 13 20:49:56 crc kubenswrapper[4790]: I0313 20:49:56.337253 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.337231612 podStartE2EDuration="1.337231612s" podCreationTimestamp="2026-03-13 20:49:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:49:56.328229667 +0000 UTC m=+1327.349345558" watchObservedRunningTime="2026-03-13 20:49:56.337231612 +0000 UTC m=+1327.358347503" Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.404635 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.407706 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.409893 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:49:57 crc kubenswrapper[4790]: E0313 20:49:57.409938 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.084814 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.154199 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") pod \"09a61a2b-7821-476f-af33-74837a0e2026\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.154295 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") pod \"09a61a2b-7821-476f-af33-74837a0e2026\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.154370 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") pod \"09a61a2b-7821-476f-af33-74837a0e2026\" (UID: \"09a61a2b-7821-476f-af33-74837a0e2026\") " Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.179481 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct" (OuterVolumeSpecName: "kube-api-access-85bct") pod "09a61a2b-7821-476f-af33-74837a0e2026" (UID: "09a61a2b-7821-476f-af33-74837a0e2026"). InnerVolumeSpecName "kube-api-access-85bct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.186538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data" (OuterVolumeSpecName: "config-data") pod "09a61a2b-7821-476f-af33-74837a0e2026" (UID: "09a61a2b-7821-476f-af33-74837a0e2026"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.196680 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09a61a2b-7821-476f-af33-74837a0e2026" (UID: "09a61a2b-7821-476f-af33-74837a0e2026"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.257890 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.258297 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85bct\" (UniqueName: \"kubernetes.io/projected/09a61a2b-7821-476f-af33-74837a0e2026-kube-api-access-85bct\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.258363 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09a61a2b-7821-476f-af33-74837a0e2026-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323000 4790 generic.go:334] "Generic (PLEG): container finished" podID="09a61a2b-7821-476f-af33-74837a0e2026" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" exitCode=0 Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323082 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerDied","Data":"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50"} Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323452 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"09a61a2b-7821-476f-af33-74837a0e2026","Type":"ContainerDied","Data":"23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401"} Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.323479 4790 scope.go:117] "RemoveContainer" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.349009 4790 scope.go:117] "RemoveContainer" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" Mar 13 20:49:58 crc kubenswrapper[4790]: E0313 20:49:58.349633 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50\": container with ID starting with bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50 not found: ID does not exist" containerID="bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.349674 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50"} err="failed to get container status \"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50\": rpc error: code = NotFound desc = could not find container \"bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50\": container with ID starting with bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50 not found: ID does not exist" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.362747 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.373836 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.385572 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: E0313 20:49:58.386305 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.386584 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.389042 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09a61a2b-7821-476f-af33-74837a0e2026" containerName="nova-scheduler-scheduler" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.390164 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.395262 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.397715 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.462719 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.463272 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.463345 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.565410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.565624 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.565661 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: 
\"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.569551 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.569602 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.581703 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"nova-scheduler-0\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " pod="openstack/nova-scheduler-0" Mar 13 20:49:58 crc kubenswrapper[4790]: I0313 20:49:58.715702 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.200550 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.245165 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.275998 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.276088 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.276126 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.276291 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") pod \"24025591-dced-41d1-bd6d-e8784c0caa3b\" (UID: \"24025591-dced-41d1-bd6d-e8784c0caa3b\") " Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.277312 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs" (OuterVolumeSpecName: "logs") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.280581 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm" (OuterVolumeSpecName: "kube-api-access-h6fnm") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "kube-api-access-h6fnm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.309899 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data" (OuterVolumeSpecName: "config-data") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.312505 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24025591-dced-41d1-bd6d-e8784c0caa3b" (UID: "24025591-dced-41d1-bd6d-e8784c0caa3b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337165 4790 generic.go:334] "Generic (PLEG): container finished" podID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" exitCode=0 Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337235 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerDied","Data":"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d"} Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337278 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"24025591-dced-41d1-bd6d-e8784c0caa3b","Type":"ContainerDied","Data":"703a1bf7672ad738d5a0561a4b2308100e00dc344ee885923cb3275bce620370"} Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.337336 4790 scope.go:117] "RemoveContainer" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.339813 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerStarted","Data":"6e7ed25e629647fdcbcdadd57d668b2cdfbe95f3b450a5732f265251e324ddb9"} Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.372212 4790 scope.go:117] "RemoveContainer" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380372 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/24025591-dced-41d1-bd6d-e8784c0caa3b-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380421 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6fnm\" (UniqueName: \"kubernetes.io/projected/24025591-dced-41d1-bd6d-e8784c0caa3b-kube-api-access-h6fnm\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380445 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.380457 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24025591-dced-41d1-bd6d-e8784c0caa3b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.385156 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.397642 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412192 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.412618 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412635 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.412648 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412655 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412835 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-api" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.412852 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" containerName="nova-api-log" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.413800 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.418726 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.420396 4790 scope.go:117] "RemoveContainer" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.423435 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d\": container with ID starting with 1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d not found: ID does not exist" containerID="1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.423488 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d"} err="failed to get container status \"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d\": rpc error: code = NotFound desc = could not find container \"1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d\": container with ID starting with 1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d not found: ID does not exist" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.423522 4790 scope.go:117] "RemoveContainer" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" Mar 13 20:49:59 crc kubenswrapper[4790]: E0313 20:49:59.425765 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263\": container with ID starting with f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263 not found: ID does not exist" containerID="f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.425792 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263"} err="failed to get container status \"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263\": rpc error: code = NotFound desc = could not find container \"f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263\": container with ID starting with f8bf9d3a68eb10352238475591a1cf68c37d036a728c12868f74bc3be9fa6263 not found: ID does not exist" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.437519 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.481917 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.481994 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.482040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.482098 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583114 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583192 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583233 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583269 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.583947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.588039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.588977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.598552 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"nova-api-0\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " pod="openstack/nova-api-0" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.670155 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09a61a2b-7821-476f-af33-74837a0e2026" path="/var/lib/kubelet/pods/09a61a2b-7821-476f-af33-74837a0e2026/volumes" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.670843 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24025591-dced-41d1-bd6d-e8784c0caa3b" path="/var/lib/kubelet/pods/24025591-dced-41d1-bd6d-e8784c0caa3b/volumes" Mar 13 20:49:59 crc kubenswrapper[4790]: I0313 20:49:59.749125 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.132851 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.134674 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.137830 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.137945 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.138187 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.146114 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.183593 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.195312 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"auto-csr-approver-29557250-wqt56\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.303073 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"auto-csr-approver-29557250-wqt56\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.323981 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"auto-csr-approver-29557250-wqt56\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.352293 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerStarted","Data":"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e"} Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.354539 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerStarted","Data":"d40e3fdf8db9cbcc4affa484642d64cee75cff31f6fe4e94fb4c91f6efd99014"} Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.381945 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.381917209 podStartE2EDuration="2.381917209s" podCreationTimestamp="2026-03-13 20:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:00.377013735 +0000 UTC m=+1331.398129636" watchObservedRunningTime="2026-03-13 20:50:00.381917209 +0000 UTC m=+1331.403033110" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.453036 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:00 crc kubenswrapper[4790]: I0313 20:50:00.904333 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.368405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557250-wqt56" event={"ID":"d00a5fd8-e634-4969-90ad-6850179e7de1","Type":"ContainerStarted","Data":"cb2b864f403d942403e811cc253d4d1b44763bf8f61a8cd36937bc69bd77a8eb"} Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.371215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerStarted","Data":"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525"} Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.371288 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerStarted","Data":"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f"} Mar 13 20:50:01 crc kubenswrapper[4790]: I0313 20:50:01.398963 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.398944448 podStartE2EDuration="2.398944448s" podCreationTimestamp="2026-03-13 20:49:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:01.393132389 +0000 UTC m=+1332.414248290" watchObservedRunningTime="2026-03-13 20:50:01.398944448 +0000 UTC m=+1332.420060339" Mar 13 20:50:02 crc kubenswrapper[4790]: I0313 20:50:02.466745 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 20:50:03 crc kubenswrapper[4790]: I0313 20:50:03.395710 4790 generic.go:334] "Generic (PLEG): container finished" podID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerID="a46a82afe76ba100b2ac912d7fb0a03ce75de0a957f3543d9259571fea13e90c" exitCode=0 Mar 13 20:50:03 crc kubenswrapper[4790]: I0313 20:50:03.395786 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557250-wqt56" 
event={"ID":"d00a5fd8-e634-4969-90ad-6850179e7de1","Type":"ContainerDied","Data":"a46a82afe76ba100b2ac912d7fb0a03ce75de0a957f3543d9259571fea13e90c"} Mar 13 20:50:03 crc kubenswrapper[4790]: I0313 20:50:03.716044 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.810518 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.911196 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") pod \"d00a5fd8-e634-4969-90ad-6850179e7de1\" (UID: \"d00a5fd8-e634-4969-90ad-6850179e7de1\") " Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.918641 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl" (OuterVolumeSpecName: "kube-api-access-26zzl") pod "d00a5fd8-e634-4969-90ad-6850179e7de1" (UID: "d00a5fd8-e634-4969-90ad-6850179e7de1"). InnerVolumeSpecName "kube-api-access-26zzl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.955715 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:04 crc kubenswrapper[4790]: I0313 20:50:04.955764 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.014083 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26zzl\" (UniqueName: \"kubernetes.io/projected/d00a5fd8-e634-4969-90ad-6850179e7de1-kube-api-access-26zzl\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.414534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557250-wqt56" event={"ID":"d00a5fd8-e634-4969-90ad-6850179e7de1","Type":"ContainerDied","Data":"cb2b864f403d942403e811cc253d4d1b44763bf8f61a8cd36937bc69bd77a8eb"} Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.414578 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb2b864f403d942403e811cc253d4d1b44763bf8f61a8cd36937bc69bd77a8eb" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.414593 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557250-wqt56" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.626590 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.885616 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.894672 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557244-sndr9"] Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.956694 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.956915 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" containerID="cri-o://7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8" gracePeriod=30 Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.971629 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:05 crc kubenswrapper[4790]: I0313 20:50:05.971656 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.198:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.425704 4790 generic.go:334] "Generic (PLEG): container finished" podID="b4696d4e-6124-4bcc-b257-651108f6b837" containerID="7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8" exitCode=2 Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.425799 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerDied","Data":"7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8"} Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.426069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"b4696d4e-6124-4bcc-b257-651108f6b837","Type":"ContainerDied","Data":"a3ba4dde9b3affbf2de80fd01b6004ec5bcc39b41c69eac7056b983bf5ce8c10"} Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.426087 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3ba4dde9b3affbf2de80fd01b6004ec5bcc39b41c69eac7056b983bf5ce8c10" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.448939 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.541833 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") pod \"b4696d4e-6124-4bcc-b257-651108f6b837\" (UID: \"b4696d4e-6124-4bcc-b257-651108f6b837\") " Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.550321 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r" (OuterVolumeSpecName: "kube-api-access-6cc5r") pod "b4696d4e-6124-4bcc-b257-651108f6b837" (UID: "b4696d4e-6124-4bcc-b257-651108f6b837"). InnerVolumeSpecName "kube-api-access-6cc5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:06 crc kubenswrapper[4790]: I0313 20:50:06.645415 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6cc5r\" (UniqueName: \"kubernetes.io/projected/b4696d4e-6124-4bcc-b257-651108f6b837-kube-api-access-6cc5r\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.435207 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.467972 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.479547 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.489619 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: E0313 20:50:07.489998 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerName="oc" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490016 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerName="oc" Mar 13 20:50:07 crc kubenswrapper[4790]: E0313 20:50:07.490029 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490035 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490224 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" containerName="oc" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490243 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" containerName="kube-state-metrics" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.490803 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.493917 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.494279 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.510819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.560669 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.560744 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.560798 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsd4l\" (UniqueName: \"kubernetes.io/projected/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-api-access-jsd4l\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.561016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662320 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662664 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662772 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.662878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsd4l\" 
(UniqueName: \"kubernetes.io/projected/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-api-access-jsd4l\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.666860 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.668412 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.671329 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f42b93e-6de8-423c-a2d5-dd57885de32c" path="/var/lib/kubelet/pods/7f42b93e-6de8-423c-a2d5-dd57885de32c/volumes" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.672343 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4696d4e-6124-4bcc-b257-651108f6b837" path="/var/lib/kubelet/pods/b4696d4e-6124-4bcc-b257-651108f6b837/volumes" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.674324 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.680599 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsd4l\" (UniqueName: \"kubernetes.io/projected/2ae1ef11-086d-4d65-bfcb-987f3973fdc5-kube-api-access-jsd4l\") pod \"kube-state-metrics-0\" (UID: \"2ae1ef11-086d-4d65-bfcb-987f3973fdc5\") " pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.805932 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.864562 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866326 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" containerID="cri-o://f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" gracePeriod=30 Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866838 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" containerID="cri-o://986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" gracePeriod=30 Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866905 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" containerID="cri-o://8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" gracePeriod=30 Mar 13 20:50:07 crc kubenswrapper[4790]: I0313 20:50:07.866949 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" containerID="cri-o://dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" gracePeriod=30 Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.437805 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452026 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" exitCode=0 Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452061 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" exitCode=2 Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46"} Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.452103 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922"} Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.716752 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 20:50:08 crc kubenswrapper[4790]: I0313 20:50:08.742626 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.462229 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ae1ef11-086d-4d65-bfcb-987f3973fdc5","Type":"ContainerStarted","Data":"83cf2b18dc8eded2ca99b94c1507041268bc933f2620be8caa37cd148556a6c4"} Mar 13 20:50:09 crc 
kubenswrapper[4790]: I0313 20:50:09.462557 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.462571 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2ae1ef11-086d-4d65-bfcb-987f3973fdc5","Type":"ContainerStarted","Data":"ba0a53291f6b5fed52f31c4c8977792a4c7df319c61a49ca605419e58bc73c3a"} Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.465473 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" exitCode=0 Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.465542 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603"} Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.489863 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.128949849 podStartE2EDuration="2.489842682s" podCreationTimestamp="2026-03-13 20:50:07 +0000 UTC" firstStartedPulling="2026-03-13 20:50:08.440927274 +0000 UTC m=+1339.462043165" lastFinishedPulling="2026-03-13 20:50:08.801820107 +0000 UTC m=+1339.822935998" observedRunningTime="2026-03-13 20:50:09.484182898 +0000 UTC m=+1340.505298789" watchObservedRunningTime="2026-03-13 20:50:09.489842682 +0000 UTC m=+1340.510958573" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.505730 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.750617 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:09 crc kubenswrapper[4790]: I0313 20:50:09.750672 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:10 crc kubenswrapper[4790]: I0313 20:50:10.832543 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:10 crc kubenswrapper[4790]: I0313 20:50:10.832689 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.201:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.475237 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483476 4790 generic.go:334] "Generic (PLEG): container finished" podID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" exitCode=0 Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483538 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6"} Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31f1d628-34fa-4e75-8aa8-f3e724839ee8","Type":"ContainerDied","Data":"d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe"} Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483574 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.483667 4790 scope.go:117] "RemoveContainer" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.507025 4790 scope.go:117] "RemoveContainer" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.528066 4790 scope.go:117] "RemoveContainer" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.552521 4790 scope.go:117] "RemoveContainer" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.562899 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.562967 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.562992 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563076 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563156 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: 
\"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563203 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.563272 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") pod \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\" (UID: \"31f1d628-34fa-4e75-8aa8-f3e724839ee8\") " Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.564172 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.567697 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.573784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth" (OuterVolumeSpecName: "kube-api-access-44qth") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "kube-api-access-44qth". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.582545 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts" (OuterVolumeSpecName: "scripts") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.613974 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.639596 4790 scope.go:117] "RemoveContainer" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.660802 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46\": container with ID starting with 986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46 not found: ID does not exist" containerID="986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.660854 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46"} err="failed to get container status \"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46\": rpc error: code = NotFound desc = could not find container \"986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46\": container with ID starting with 986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.660879 4790 scope.go:117] "RemoveContainer" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667804 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667841 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31f1d628-34fa-4e75-8aa8-f3e724839ee8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667849 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667859 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44qth\" (UniqueName: \"kubernetes.io/projected/31f1d628-34fa-4e75-8aa8-f3e724839ee8-kube-api-access-44qth\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.667868 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.668869 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922\": container with ID starting with 8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922 not found: ID does not exist" containerID="8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.668910 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922"} err="failed to get container status 
\"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922\": rpc error: code = NotFound desc = could not find container \"8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922\": container with ID starting with 8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.668935 4790 scope.go:117] "RemoveContainer" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.669204 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6\": container with ID starting with dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6 not found: ID does not exist" containerID="dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.669227 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6"} err="failed to get container status \"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6\": rpc error: code = NotFound desc = could not find container \"dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6\": container with ID starting with dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.669241 4790 scope.go:117] "RemoveContainer" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.669443 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603\": container with ID starting with f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603 not found: ID does not exist" containerID="f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.669461 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603"} err="failed to get container status \"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603\": rpc error: code = NotFound desc = could not find container \"f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603\": container with ID starting with f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603 not found: ID does not exist" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.727571 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.765758 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data" (OuterVolumeSpecName: "config-data") pod "31f1d628-34fa-4e75-8aa8-f3e724839ee8" (UID: "31f1d628-34fa-4e75-8aa8-f3e724839ee8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.770233 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.770274 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31f1d628-34fa-4e75-8aa8-f3e724839ee8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.824307 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.844630 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.853586 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854145 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854171 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854192 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854200 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854221 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854229 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" Mar 13 20:50:11 crc kubenswrapper[4790]: E0313 20:50:11.854258 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854266 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854521 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="proxy-httpd" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854547 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-central-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854568 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="ceilometer-notification-agent" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.854582 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" containerName="sg-core" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.856599 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.860979 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.861301 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.869024 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.880755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.974832 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.974992 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975113 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975208 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975253 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975293 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975354 4790 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:11 crc kubenswrapper[4790]: I0313 20:50:11.975467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077435 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077612 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077705 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077791 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077870 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.077959 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.078067 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.078150 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.078355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082355 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082673 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.082778 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.083087 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.101295 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"ceilometer-0\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.174905 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.675771 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.956450 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:50:12 crc kubenswrapper[4790]: I0313 20:50:12.956625 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:50:13 crc kubenswrapper[4790]: I0313 20:50:13.515492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7"} Mar 13 20:50:13 crc kubenswrapper[4790]: I0313 20:50:13.516074 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"437f52cb36b41910903372ec5bcd7008b8c1ad39f31664517f1ae136ab440e48"} Mar 13 20:50:13 crc kubenswrapper[4790]: I0313 20:50:13.670247 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31f1d628-34fa-4e75-8aa8-f3e724839ee8" path="/var/lib/kubelet/pods/31f1d628-34fa-4e75-8aa8-f3e724839ee8/volumes" Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.526905 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e"} Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.961519 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.965843 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:50:14 crc kubenswrapper[4790]: I0313 20:50:14.966178 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:50:15 crc kubenswrapper[4790]: I0313 20:50:15.537069 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688"} Mar 13 20:50:15 crc kubenswrapper[4790]: I0313 20:50:15.551343 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:50:17 crc kubenswrapper[4790]: W0313 20:50:17.125251 4790 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd00a5fd8_e634_4969_90ad_6850179e7de1.slice": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd00a5fd8_e634_4969_90ad_6850179e7de1.slice: no such file or directory Mar 13 20:50:17 crc kubenswrapper[4790]: E0313 20:50:17.342861 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-conmon-bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice/crio-conmon-1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice/crio-1b14a7e9ab84369e8f3ebfcaf4e51a705fd1574e6c54112baaa4abc4359c593d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4696d4e_6124_4bcc_b257_651108f6b837.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24025591_dced_41d1_bd6d_e8784c0caa3b.slice/crio-703a1bf7672ad738d5a0561a4b2308100e00dc344ee885923cb3275bce620370\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-d9418e81a4860ada9d69bb0521f12bde0b12309263c1efb3c1ac8e85db41aebe\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-bb4a1f46091efe866455e4168536b535cef0aca552c650ab609ddb4288360b50.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-986317f71fc48d93eed0a6d4117b0c768d191fa703fab44f48cc32ca7c94cb46.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255451e0_9cb8_424f_a327_6e7ef4e4d775.slice/crio-e74515b96fc5b1b6d1708a6223eb1e4dd8c20dddeee25e960edd582e66f5fe0c\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-f749bd7cf5f66438ede8bf1b3c20e8526768b32092eadfbe81f4a42d31a02603.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4696d4e_6124_4bcc_b257_651108f6b837.slice/crio-7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7082b53_1345_4c47_a9bf_b87d9e1fd3ca.slice/crio-883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod255451e0_9cb8_424f_a327_6e7ef4e4d775.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-8672f9d7879a455be9373c8a7415e8b1cbd39a91beaa6b8f12c45912ac609922.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-conmon-dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09a61a2b_7821_476f_af33_74837a0e2026.slice/crio-23a76757c4e9c694669e42f69462004741f841168e478fbc62cbd9c2dbd01401\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice/crio-dcca161363298132f9b2b40db355685590f9ffcbad34a70cba9d5f18153fa2e6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb4696d4e_6124_4bcc_b257_651108f6b837.slice/crio-conmon-7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31f1d628_34fa_4e75_8aa8_f3e724839ee8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7082b53_1345_4c47_a9bf_b87d9e1fd3ca.slice/crio-conmon-883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76.scope\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.439967 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.483016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") pod \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.483198 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") pod \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.483299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") pod \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\" (UID: \"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca\") " Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.488897 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs" (OuterVolumeSpecName: "kube-api-access-8rsbs") pod "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" (UID: "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca"). InnerVolumeSpecName "kube-api-access-8rsbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.513192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data" (OuterVolumeSpecName: "config-data") pod "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" (UID: "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.517089 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" (UID: "f7082b53-1345-4c47-a9bf-b87d9e1fd3ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.555859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerStarted","Data":"99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678"} Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.557190 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.559174 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" exitCode=137 Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.559642 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.561046 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerDied","Data":"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76"} Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.561116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f7082b53-1345-4c47-a9bf-b87d9e1fd3ca","Type":"ContainerDied","Data":"51acefd6782a10757a2c7ce4b0059b90de6cb5277f1ef646ad641cd716c69ecd"} Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.561151 4790 scope.go:117] "RemoveContainer" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.589173 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.589214 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.589225 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rsbs\" (UniqueName: \"kubernetes.io/projected/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca-kube-api-access-8rsbs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.592221 4790 scope.go:117] "RemoveContainer" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" Mar 13 20:50:17 crc kubenswrapper[4790]: E0313 20:50:17.592596 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76\": container with ID starting with 883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76 not found: ID does not exist" containerID="883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.592641 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76"} err="failed to get container status \"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76\": rpc error: code = NotFound desc = could not find container \"883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76\": container with ID starting with 883c2d53541e6dec02432d4bf462952ea52e0782a0762ed4ac3e98fe13d01b76 not found: ID does not exist" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.604150 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.337494835 podStartE2EDuration="6.604128496s" podCreationTimestamp="2026-03-13 20:50:11 +0000 UTC" firstStartedPulling="2026-03-13 20:50:12.67442427 +0000 UTC m=+1343.695540161" lastFinishedPulling="2026-03-13 20:50:16.941057931 +0000 UTC m=+1347.962173822" observedRunningTime="2026-03-13 20:50:17.593480475 +0000 UTC m=+1348.614596376" watchObservedRunningTime="2026-03-13 20:50:17.604128496 +0000 UTC m=+1348.625244397" Mar 13 20:50:17 crc 
kubenswrapper[4790]: I0313 20:50:17.616337 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.623899 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.643772 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: E0313 20:50:17.644200 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.644221 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.644441 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" containerName="nova-cell1-novncproxy-novncproxy" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.645002 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.647421 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.647770 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.648665 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.658543 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.699207 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7082b53-1345-4c47-a9bf-b87d9e1fd3ca" path="/var/lib/kubelet/pods/f7082b53-1345-4c47-a9bf-b87d9e1fd3ca/volumes" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.749336 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.749772 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.796775 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.796974 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.797027 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.797061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.797115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8plzs\" (UniqueName: \"kubernetes.io/projected/20c0842a-c69a-4af0-aef0-ffec3f3560bc-kube-api-access-8plzs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.817520 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.898807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.898868 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8plzs\" (UniqueName: \"kubernetes.io/projected/20c0842a-c69a-4af0-aef0-ffec3f3560bc-kube-api-access-8plzs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.898949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.899035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.899059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.903813 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 
20:50:17.904605 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.908148 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.915505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/20c0842a-c69a-4af0-aef0-ffec3f3560bc-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.919230 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8plzs\" (UniqueName: \"kubernetes.io/projected/20c0842a-c69a-4af0-aef0-ffec3f3560bc-kube-api-access-8plzs\") pod \"nova-cell1-novncproxy-0\" (UID: \"20c0842a-c69a-4af0-aef0-ffec3f3560bc\") " pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:17 crc kubenswrapper[4790]: I0313 20:50:17.976371 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:18 crc kubenswrapper[4790]: I0313 20:50:18.418569 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 13 20:50:18 crc kubenswrapper[4790]: I0313 20:50:18.569057 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20c0842a-c69a-4af0-aef0-ffec3f3560bc","Type":"ContainerStarted","Data":"f4861d9a1f81670a88d5e3a86fe1788bb4d44171315713afe4aacb565a42102e"} Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.580281 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"20c0842a-c69a-4af0-aef0-ffec3f3560bc","Type":"ContainerStarted","Data":"649f54ab6c2e84e74d811136c5f4c779bf01a09a300962f5507f0c484fdeb533"} Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.614339 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.614320813 podStartE2EDuration="2.614320813s" podCreationTimestamp="2026-03-13 20:50:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:19.604020762 +0000 UTC m=+1350.625136653" watchObservedRunningTime="2026-03-13 20:50:19.614320813 +0000 UTC m=+1350.635436704" Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.754181 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.757508 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:50:19 crc kubenswrapper[4790]: I0313 20:50:19.761766 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.593422 4790 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.824947 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.826782 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.855882 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968630 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968793 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968837 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:20 crc kubenswrapper[4790]: I0313 20:50:20.968904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070306 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070370 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070429 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070456 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.070528 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071525 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071561 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071545 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.071926 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod 
\"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.089239 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"dnsmasq-dns-89c5cd4d5-kqzmj\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.153155 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:21 crc kubenswrapper[4790]: I0313 20:50:21.649313 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:50:21 crc kubenswrapper[4790]: W0313 20:50:21.650946 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod03ea3d76_1bca_44e8_986c_8e751336f93d.slice/crio-b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100 WatchSource:0}: Error finding container b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100: Status 404 returned error can't find the container with id b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100 Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.607008 4790 generic.go:334] "Generic (PLEG): container finished" podID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerID="3bf4c1a3a8959712b6bdc6bb2a33893090891a1211e7646c25c1b2fcadfa4181" exitCode=0 Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.607248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerDied","Data":"3bf4c1a3a8959712b6bdc6bb2a33893090891a1211e7646c25c1b2fcadfa4181"} Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.607702 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerStarted","Data":"b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100"} Mar 13 20:50:22 crc kubenswrapper[4790]: I0313 20:50:22.977529 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.162811 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.619577 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerStarted","Data":"4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29"} Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.619664 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" containerID="cri-o://f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.619961 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" 
containerID="cri-o://3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.620234 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.654858 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" podStartSLOduration=3.654838816 podStartE2EDuration="3.654838816s" podCreationTimestamp="2026-03-13 20:50:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:23.645517002 +0000 UTC m=+1354.666632893" watchObservedRunningTime="2026-03-13 20:50:23.654838816 +0000 UTC m=+1354.675954707" Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.731901 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.732227 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" containerID="cri-o://e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.734257 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" containerID="cri-o://99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.734568 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" containerID="cri-o://48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688" gracePeriod=30 Mar 13 20:50:23 crc kubenswrapper[4790]: I0313 20:50:23.735227 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" containerID="cri-o://5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e" gracePeriod=30 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.633412 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" exitCode=143 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.633762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerDied","Data":"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643097 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678" exitCode=0 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643130 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688" exitCode=2 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643140 4790 generic.go:334] "Generic 
(PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e" exitCode=0 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643148 4790 generic.go:334] "Generic (PLEG): container finished" podID="83ebcf30-733f-4074-9565-5582a160a8c3" containerID="e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7" exitCode=0 Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643928 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.643939 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7"} Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.924835 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981503 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981577 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981905 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.981985 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 
20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.982026 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.982057 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.982084 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") pod \"83ebcf30-733f-4074-9565-5582a160a8c3\" (UID: \"83ebcf30-733f-4074-9565-5582a160a8c3\") " Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.983477 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.983546 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.988327 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz" (OuterVolumeSpecName: "kube-api-access-fq8fz") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "kube-api-access-fq8fz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:24 crc kubenswrapper[4790]: I0313 20:50:24.999811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts" (OuterVolumeSpecName: "scripts") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.021031 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.063510 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084295 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084344 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084357 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq8fz\" (UniqueName: \"kubernetes.io/projected/83ebcf30-733f-4074-9565-5582a160a8c3-kube-api-access-fq8fz\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084393 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084403 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/83ebcf30-733f-4074-9565-5582a160a8c3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.084412 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.097979 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.119481 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data" (OuterVolumeSpecName: "config-data") pod "83ebcf30-733f-4074-9565-5582a160a8c3" (UID: "83ebcf30-733f-4074-9565-5582a160a8c3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.186059 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.186101 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83ebcf30-733f-4074-9565-5582a160a8c3-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.656955 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"83ebcf30-733f-4074-9565-5582a160a8c3","Type":"ContainerDied","Data":"437f52cb36b41910903372ec5bcd7008b8c1ad39f31664517f1ae136ab440e48"} Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.657033 4790 scope.go:117] "RemoveContainer" containerID="99a17e425d61cedf85a823b7432d78c2c5018dc3dd588da442972a911bda1678" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.657048 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.698250 4790 scope.go:117] "RemoveContainer" containerID="48ff744eb8ffe6277235dc1660b7e57a170e8dd57ea9c3e069bc4793c884f688" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.698418 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.708433 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.744559 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745081 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745104 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745124 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745146 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745153 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: E0313 20:50:25.745173 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745180 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745428 4790 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="proxy-httpd" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745455 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-notification-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745470 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="sg-core" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.745485 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" containerName="ceilometer-central-agent" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.747541 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.749497 4790 scope.go:117] "RemoveContainer" containerID="5a70244bb2bfdbe5ee07eedcf58b20d49e38de788b1d508f2fb5f8344abf2f5e" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.752246 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.755242 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.755405 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.757006 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.795761 4790 scope.go:117] "RemoveContainer" containerID="e503f9a9feeac1738b128be87172b4c3f1ed19b15121b1befddaa40e1a7ba6f7" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811063 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811112 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811147 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811171 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811322 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811447 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.811469 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.913646 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.913712 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.913872 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914847 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914630 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914934 4790 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.914962 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.915328 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.915355 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.919726 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.920076 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.920128 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.921753 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.925724 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:25 crc kubenswrapper[4790]: I0313 20:50:25.934437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"ceilometer-0\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " pod="openstack/ceilometer-0" Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.053855 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.054577 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.492074 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:50:26 crc kubenswrapper[4790]: W0313 20:50:26.493361 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddd8215d8_8b4d_4c20_a832_e2088825019b.slice/crio-18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c WatchSource:0}: Error finding container 18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c: Status 404 returned error can't find the container with id 18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c Mar 13 20:50:26 crc kubenswrapper[4790]: I0313 20:50:26.667462 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.381065 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588598 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588722 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588794 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.588935 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") pod \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\" (UID: \"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8\") " Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.591388 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs" (OuterVolumeSpecName: "logs") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.597094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb" (OuterVolumeSpecName: "kube-api-access-7mgtb") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). 
InnerVolumeSpecName "kube-api-access-7mgtb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.618562 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.621010 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.627210 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data" (OuterVolumeSpecName: "config-data") pod "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" (UID: "e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.677400 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83ebcf30-733f-4074-9565-5582a160a8c3" path="/var/lib/kubelet/pods/83ebcf30-733f-4074-9565-5582a160a8c3/volumes" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.680361 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684121 4790 generic.go:334] "Generic (PLEG): container finished" podID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" exitCode=0 Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerDied","Data":"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8","Type":"ContainerDied","Data":"d40e3fdf8db9cbcc4affa484642d64cee75cff31f6fe4e94fb4c91f6efd99014"} Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684201 4790 scope.go:117] "RemoveContainer" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.684336 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692201 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692233 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692248 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.692261 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mgtb\" (UniqueName: \"kubernetes.io/projected/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8-kube-api-access-7mgtb\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.714601 4790 scope.go:117] "RemoveContainer" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.723347 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.741281 4790 scope.go:117] "RemoveContainer" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.742232 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525\": container with ID starting with 3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525 not found: ID does not exist" containerID="3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.742291 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525"} err="failed to get container status \"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525\": rpc error: code = NotFound desc = could not find container \"3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525\": container with ID starting with 3acbf007eac3905d55ad51c638551ca8640a44de1fd0826375d08655e53af525 not found: ID does not exist" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.742327 4790 scope.go:117] "RemoveContainer" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.742869 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f\": container with ID starting with f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f not found: ID does not exist" containerID="f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.742918 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f"} 
err="failed to get container status \"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f\": rpc error: code = NotFound desc = could not find container \"f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f\": container with ID starting with f6262bd5acbaf3be3e4c60a3d813e7fa5d537ec3ac29e76405e12a6df134804f not found: ID does not exist" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.745180 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.765443 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.766020 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766039 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" Mar 13 20:50:27 crc kubenswrapper[4790]: E0313 20:50:27.766058 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766066 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766297 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-log" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.766331 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" containerName="nova-api-api" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.767591 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.769719 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.771542 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.771866 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.777226 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793508 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793563 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793606 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.793835 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.794418 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.895957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896007 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"nova-api-0\" (UID: 
\"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896059 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896089 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896141 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.896235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.897449 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.900085 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.900651 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.900717 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.905710 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.912548 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"nova-api-0\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " 
pod="openstack/nova-api-0" Mar 13 20:50:27 crc kubenswrapper[4790]: I0313 20:50:27.976912 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.013099 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.094851 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.535518 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:28 crc kubenswrapper[4790]: W0313 20:50:28.535955 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae43d767_425b_46ff_ba98_cc3dc9419ba5.slice/crio-edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82 WatchSource:0}: Error finding container edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82: Status 404 returned error can't find the container with id edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82 Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.697149 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009"} Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.700269 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerStarted","Data":"edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82"} Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.717963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.990302 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.991476 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.995155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 13 20:50:28 crc kubenswrapper[4790]: I0313 20:50:28.995597 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.004828 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020746 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020828 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020883 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.020914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.122873 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.123212 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.123250 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.123884 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.126695 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.127455 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.127892 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.139693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"nova-cell1-cell-mapping-sw4k5\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.322099 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.697794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8" path="/var/lib/kubelet/pods/e5f74b87-8c4a-490f-ad9c-75ba17e3a1a8/volumes" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.733066 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5"} Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.737866 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerStarted","Data":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.737948 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerStarted","Data":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.816459 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.816436681 podStartE2EDuration="2.816436681s" podCreationTimestamp="2026-03-13 20:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:29.766701854 +0000 UTC m=+1360.787817745" watchObservedRunningTime="2026-03-13 20:50:29.816436681 +0000 UTC m=+1360.837552562" Mar 13 20:50:29 crc kubenswrapper[4790]: I0313 20:50:29.819816 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 20:50:30 crc kubenswrapper[4790]: I0313 20:50:30.748556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerStarted","Data":"4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570"} Mar 13 20:50:30 crc kubenswrapper[4790]: I0313 20:50:30.748885 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerStarted","Data":"4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896"} Mar 13 20:50:30 crc kubenswrapper[4790]: I0313 20:50:30.769116 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-sw4k5" podStartSLOduration=2.769098814 podStartE2EDuration="2.769098814s" podCreationTimestamp="2026-03-13 20:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:30.760657274 +0000 UTC m=+1361.781773165" watchObservedRunningTime="2026-03-13 20:50:30.769098814 +0000 UTC m=+1361.790214705" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.155756 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.282801 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.283512 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" containerID="cri-o://31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f" gracePeriod=10 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.773153 4790 generic.go:334] "Generic (PLEG): container finished" podID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerID="31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f" exitCode=0 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.773389 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerDied","Data":"31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f"} Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.791532 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerStarted","Data":"684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8"} Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.791754 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" containerID="cri-o://684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.791856 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" containerID="cri-o://7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.792021 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" containerID="cri-o://4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.792059 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" containerID="cri-o://32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69" gracePeriod=30 Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.792110 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.827499 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.004437636 podStartE2EDuration="6.827473261s" podCreationTimestamp="2026-03-13 20:50:25 +0000 UTC" firstStartedPulling="2026-03-13 20:50:26.495046682 +0000 UTC m=+1357.516162573" lastFinishedPulling="2026-03-13 20:50:31.318082307 +0000 UTC m=+1362.339198198" observedRunningTime="2026-03-13 20:50:31.819097602 +0000 UTC m=+1362.840213503" watchObservedRunningTime="2026-03-13 20:50:31.827473261 +0000 UTC m=+1362.848589172" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.961433 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.996882 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:31 crc kubenswrapper[4790]: I0313 20:50:31.997007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.054611 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.055193 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099110 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099578 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099634 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.099715 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") pod \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\" (UID: \"aa96d2ec-af8f-4ef3-96a2-108e003c669b\") " Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.100305 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.100330 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-svc\") on node 
\"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.102955 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj" (OuterVolumeSpecName: "kube-api-access-lxmdj") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "kube-api-access-lxmdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.148394 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.158572 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config" (OuterVolumeSpecName: "config") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.171255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aa96d2ec-af8f-4ef3-96a2-108e003c669b" (UID: "aa96d2ec-af8f-4ef3-96a2-108e003c669b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202341 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202394 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lxmdj\" (UniqueName: \"kubernetes.io/projected/aa96d2ec-af8f-4ef3-96a2-108e003c669b-kube-api-access-lxmdj\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202405 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.202416 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aa96d2ec-af8f-4ef3-96a2-108e003c669b-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.802499 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" event={"ID":"aa96d2ec-af8f-4ef3-96a2-108e003c669b","Type":"ContainerDied","Data":"d410e2281728cc5d35324b5d9753eac6a283696daed20ea1b6c874c3b410e22c"} Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.802570 4790 scope.go:117] "RemoveContainer" containerID="31569c9f9b3d97fb94632b2003b39bbe5006cf77f3a60db89747921488537e4f" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.802745 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-757b4f8459-jstn6" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807235 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5" exitCode=2 Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807265 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009" exitCode=0 Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807287 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5"} Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.807317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009"} Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.830054 4790 scope.go:117] "RemoveContainer" containerID="62406a3417f49cd6fee467ec15aafed59672de36ebec3945dba28321522a57f0" Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.870263 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:50:32 crc kubenswrapper[4790]: I0313 20:50:32.882191 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-757b4f8459-jstn6"] Mar 13 20:50:33 crc kubenswrapper[4790]: I0313 20:50:33.685360 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" path="/var/lib/kubelet/pods/aa96d2ec-af8f-4ef3-96a2-108e003c669b/volumes" Mar 13 20:50:33 crc kubenswrapper[4790]: I0313 20:50:33.818682 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69" exitCode=0 Mar 13 20:50:33 crc kubenswrapper[4790]: I0313 20:50:33.818771 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69"} Mar 13 20:50:35 crc kubenswrapper[4790]: I0313 20:50:35.840977 4790 generic.go:334] "Generic (PLEG): container finished" podID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerID="4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570" exitCode=0 Mar 13 20:50:35 crc kubenswrapper[4790]: I0313 20:50:35.841034 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerDied","Data":"4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570"} Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.198729 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301072 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301171 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301217 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.301282 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") pod \"263e3744-6b98-4d91-aba2-cd28a616d9df\" (UID: \"263e3744-6b98-4d91-aba2-cd28a616d9df\") " Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.306781 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts" (OuterVolumeSpecName: "scripts") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.307061 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt" (OuterVolumeSpecName: "kube-api-access-p2jlt") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "kube-api-access-p2jlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.327099 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data" (OuterVolumeSpecName: "config-data") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.332879 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "263e3744-6b98-4d91-aba2-cd28a616d9df" (UID: "263e3744-6b98-4d91-aba2-cd28a616d9df"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403409 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403461 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403477 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2jlt\" (UniqueName: \"kubernetes.io/projected/263e3744-6b98-4d91-aba2-cd28a616d9df-kube-api-access-p2jlt\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.403492 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/263e3744-6b98-4d91-aba2-cd28a616d9df-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:37 crc kubenswrapper[4790]: E0313 20:50:37.826737 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod263e3744_6b98_4d91_aba2_cd28a616d9df.slice/crio-4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod263e3744_6b98_4d91_aba2_cd28a616d9df.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.858750 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-sw4k5" event={"ID":"263e3744-6b98-4d91-aba2-cd28a616d9df","Type":"ContainerDied","Data":"4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896"} Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.859157 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4091775fd736a14082c9f6fe75f707311111beffcfee7aa2afc5e6278dd2f896" Mar 13 20:50:37 crc kubenswrapper[4790]: I0313 20:50:37.858828 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-sw4k5" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.038980 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.039260 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" containerID="cri-o://94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.050799 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.051062 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" containerID="cri-o://7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.051143 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" containerID="cri-o://fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.124114 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.124433 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" containerID="cri-o://8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.124567 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" containerID="cri-o://ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" gracePeriod=30 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.632141 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.719029 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.721870 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.724130 4790 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.724205 4790 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.743298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.743396 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744063 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744141 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.744173 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxzbh\" (UniqueName: 
\"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") pod \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\" (UID: \"ae43d767-425b-46ff-ba98-cc3dc9419ba5\") " Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.745205 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs" (OuterVolumeSpecName: "logs") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.749034 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh" (OuterVolumeSpecName: "kube-api-access-xxzbh") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "kube-api-access-xxzbh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.777653 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.778489 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data" (OuterVolumeSpecName: "config-data") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.799549 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.818857 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "ae43d767-425b-46ff-ba98-cc3dc9419ba5" (UID: "ae43d767-425b-46ff-ba98-cc3dc9419ba5"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847323 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847693 4790 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847754 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae43d767-425b-46ff-ba98-cc3dc9419ba5-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847804 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxzbh\" (UniqueName: \"kubernetes.io/projected/ae43d767-425b-46ff-ba98-cc3dc9419ba5-kube-api-access-xxzbh\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847868 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.847919 4790 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae43d767-425b-46ff-ba98-cc3dc9419ba5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.870351 4790 generic.go:334] "Generic (PLEG): container finished" podID="b6868acd-5476-49b4-958c-8f68fde161b9" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" exitCode=143 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.870456 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerDied","Data":"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873815 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" exitCode=0 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873907 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerDied","Data":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerDied","Data":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873901 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.874023 4790 scope.go:117] "RemoveContainer" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.873940 4790 generic.go:334] "Generic (PLEG): container finished" podID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" exitCode=143 Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.874242 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ae43d767-425b-46ff-ba98-cc3dc9419ba5","Type":"ContainerDied","Data":"edf979dc93b443df34ce2953c0ac6333708a8588e34d9da9b7381d8bf4639a82"} Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.908738 4790 scope.go:117] "RemoveContainer" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.919051 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.929444 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.939759 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.940456 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="init" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.940550 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="init" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.940642 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerName="nova-manage" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.940724 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerName="nova-manage" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.940806 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.940910 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.941009 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941077 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.941149 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941216 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941541 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-api" Mar 13 20:50:38 crc 
kubenswrapper[4790]: I0313 20:50:38.941623 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" containerName="nova-manage" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941695 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa96d2ec-af8f-4ef3-96a2-108e003c669b" containerName="dnsmasq-dns" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.941760 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" containerName="nova-api-log" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.942807 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.945618 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.947333 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.949875 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.951960 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.972029 4790 scope.go:117] "RemoveContainer" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.974790 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": container with ID starting with fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11 not found: ID does not exist" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.974878 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} err="failed to get container status \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": rpc error: code = NotFound desc = could not find container \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": container with ID starting with fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11 not found: ID does not exist" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.974918 4790 scope.go:117] "RemoveContainer" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: E0313 20:50:38.975294 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": container with ID starting with 7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431 not found: ID does not exist" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.975318 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} err="failed to get container status 
\"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": rpc error: code = NotFound desc = could not find container \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": container with ID starting with 7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431 not found: ID does not exist" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.975332 4790 scope.go:117] "RemoveContainer" containerID="fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.976211 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11"} err="failed to get container status \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": rpc error: code = NotFound desc = could not find container \"fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11\": container with ID starting with fab3369d6a562270202b6cf09200bf8c99988b0e29c29c79e0aa139b46645d11 not found: ID does not exist" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.976369 4790 scope.go:117] "RemoveContainer" containerID="7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431" Mar 13 20:50:38 crc kubenswrapper[4790]: I0313 20:50:38.980576 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431"} err="failed to get container status \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": rpc error: code = NotFound desc = could not find container \"7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431\": container with ID starting with 7b3bae176e5eadf686e05b6fcf19f6b8fdb448b7174313826c3542d2922c3431 not found: ID does not exist" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.051886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-config-data\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052221 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-logs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052760 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-public-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.052848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445zd\" (UniqueName: \"kubernetes.io/projected/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-kube-api-access-445zd\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.155875 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-config-data\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.155968 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156013 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156108 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-logs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156140 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-public-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156207 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-445zd\" (UniqueName: \"kubernetes.io/projected/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-kube-api-access-445zd\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.156991 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-logs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.160281 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-config-data\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.160977 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-public-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.162963 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.164775 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.177598 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-445zd\" (UniqueName: \"kubernetes.io/projected/4597d91c-0f9f-4e33-aaa7-b25e7076e13a-kube-api-access-445zd\") pod \"nova-api-0\" (UID: \"4597d91c-0f9f-4e33-aaa7-b25e7076e13a\") " pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.268141 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.670627 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae43d767-425b-46ff-ba98-cc3dc9419ba5" path="/var/lib/kubelet/pods/ae43d767-425b-46ff-ba98-cc3dc9419ba5/volumes" Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.706189 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.901622 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4597d91c-0f9f-4e33-aaa7-b25e7076e13a","Type":"ContainerStarted","Data":"3f30e0061e280a401241f19ee12e494cd4b030bd562acd0b72ded4501c35c83e"} Mar 13 20:50:39 crc kubenswrapper[4790]: I0313 20:50:39.901683 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4597d91c-0f9f-4e33-aaa7-b25e7076e13a","Type":"ContainerStarted","Data":"ff0150884b11d59c283f22874c57891380471d45835859210f6d8343f899227b"} Mar 13 20:50:40 crc kubenswrapper[4790]: I0313 20:50:40.915459 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4597d91c-0f9f-4e33-aaa7-b25e7076e13a","Type":"ContainerStarted","Data":"66096be8201acafa7bf92ee6f89be2e7ad60634981a3b1ad6019d36671094cb8"} Mar 13 20:50:40 crc kubenswrapper[4790]: I0313 20:50:40.941942 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.941919523 podStartE2EDuration="2.941919523s" podCreationTimestamp="2026-03-13 20:50:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:40.932968978 +0000 UTC m=+1371.954084869" watchObservedRunningTime="2026-03-13 20:50:40.941919523 +0000 UTC m=+1371.963035414" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.810976 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911205 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911258 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911285 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911445 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911549 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") pod \"b6868acd-5476-49b4-958c-8f68fde161b9\" (UID: \"b6868acd-5476-49b4-958c-8f68fde161b9\") " Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.911839 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs" (OuterVolumeSpecName: "logs") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.912241 4790 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b6868acd-5476-49b4-958c-8f68fde161b9-logs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.916579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4" (OuterVolumeSpecName: "kube-api-access-g45p4") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "kube-api-access-g45p4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.926875 4790 generic.go:334] "Generic (PLEG): container finished" podID="b6868acd-5476-49b4-958c-8f68fde161b9" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" exitCode=0 Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.926959 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerDied","Data":"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498"} Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.926988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b6868acd-5476-49b4-958c-8f68fde161b9","Type":"ContainerDied","Data":"1998e9c0ed2f49c51df1fb979275385f3c3c928b8ffbba368fee9881d45e3a34"} Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.927006 4790 scope.go:117] "RemoveContainer" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.927043 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.937715 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data" (OuterVolumeSpecName: "config-data") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.953564 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:41 crc kubenswrapper[4790]: I0313 20:50:41.973553 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b6868acd-5476-49b4-958c-8f68fde161b9" (UID: "b6868acd-5476-49b4-958c-8f68fde161b9"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016687 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016752 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g45p4\" (UniqueName: \"kubernetes.io/projected/b6868acd-5476-49b4-958c-8f68fde161b9-kube-api-access-g45p4\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016765 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.016777 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b6868acd-5476-49b4-958c-8f68fde161b9-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.033826 4790 scope.go:117] "RemoveContainer" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.054545 4790 scope.go:117] "RemoveContainer" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.054969 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498\": container with ID starting with ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498 not found: ID does not exist" containerID="ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.055033 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498"} err="failed to get container status \"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498\": rpc error: code = NotFound desc = could not find container \"ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498\": container with ID starting with ad33f248352ae709d9f930ce51ab0e5fe04a2c01c0c2bd3fe99755d295f43498 not found: ID does not exist" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.055064 4790 scope.go:117] "RemoveContainer" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.055335 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67\": container with ID starting with 8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67 not found: ID does not exist" containerID="8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.055385 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67"} err="failed to get container status \"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67\": rpc 
error: code = NotFound desc = could not find container \"8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67\": container with ID starting with 8c91b80dd9fc2bebbbb77137858b42a65b5f891741651f14f355153816248d67 not found: ID does not exist" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.256844 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.306192 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.313334 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.313967 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.313993 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.314023 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.314032 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.314294 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-log" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.314319 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" containerName="nova-metadata-metadata" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.315469 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.318742 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.318815 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.332564 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.423987 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b43558-bdf4-45e4-b1bc-6e9b325e163b-logs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424367 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424446 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-config-data\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.424530 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r52ms\" (UniqueName: \"kubernetes.io/projected/00b43558-bdf4-45e4-b1bc-6e9b325e163b-kube-api-access-r52ms\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525750 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b43558-bdf4-45e4-b1bc-6e9b325e163b-logs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525833 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525883 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-config-data\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc 
kubenswrapper[4790]: I0313 20:50:42.525913 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.525933 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r52ms\" (UniqueName: \"kubernetes.io/projected/00b43558-bdf4-45e4-b1bc-6e9b325e163b-kube-api-access-r52ms\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.526949 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/00b43558-bdf4-45e4-b1bc-6e9b325e163b-logs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.529992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-config-data\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.530162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.530699 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00b43558-bdf4-45e4-b1bc-6e9b325e163b-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.543800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r52ms\" (UniqueName: \"kubernetes.io/projected/00b43558-bdf4-45e4-b1bc-6e9b325e163b-kube-api-access-r52ms\") pod \"nova-metadata-0\" (UID: \"00b43558-bdf4-45e4-b1bc-6e9b325e163b\") " pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.638559 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.754082 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.834863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") pod \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.834930 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") pod \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.835005 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") pod \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\" (UID: \"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4\") " Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.843118 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd" (OuterVolumeSpecName: "kube-api-access-x7pzd") pod "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" (UID: "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4"). InnerVolumeSpecName "kube-api-access-x7pzd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.868854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data" (OuterVolumeSpecName: "config-data") pod "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" (UID: "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.871880 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" (UID: "dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.937061 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7pzd\" (UniqueName: \"kubernetes.io/projected/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-kube-api-access-x7pzd\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.937100 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.937112 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938316 4790 generic.go:334] "Generic (PLEG): container finished" podID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" exitCode=0 Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938346 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerDied","Data":"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e"} Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938397 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4","Type":"ContainerDied","Data":"6e7ed25e629647fdcbcdadd57d668b2cdfbe95f3b450a5732f265251e324ddb9"} Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938417 4790 scope.go:117] "RemoveContainer" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.938536 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.973811 4790 scope.go:117] "RemoveContainer" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" Mar 13 20:50:42 crc kubenswrapper[4790]: E0313 20:50:42.974221 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e\": container with ID starting with 94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e not found: ID does not exist" containerID="94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.974250 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e"} err="failed to get container status \"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e\": rpc error: code = NotFound desc = could not find container \"94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e\": container with ID starting with 94e4e193dd0d983e08f453cddbe2b0faab79c8ba2888d153be176bd0443bce4e not found: ID does not exist" Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.978241 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:42 crc kubenswrapper[4790]: I0313 20:50:42.988063 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.002356 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: E0313 20:50:43.002790 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.002803 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.002992 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" containerName="nova-scheduler-scheduler" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.003745 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.006111 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.011567 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.076569 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.139732 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jqk4\" (UniqueName: \"kubernetes.io/projected/01e86425-f126-4827-b727-e8c73d152aa6-kube-api-access-5jqk4\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.139785 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-config-data\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.139820 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.241179 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jqk4\" (UniqueName: \"kubernetes.io/projected/01e86425-f126-4827-b727-e8c73d152aa6-kube-api-access-5jqk4\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.241540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-config-data\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.241591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.245266 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.245710 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/01e86425-f126-4827-b727-e8c73d152aa6-config-data\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: 
I0313 20:50:43.261694 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jqk4\" (UniqueName: \"kubernetes.io/projected/01e86425-f126-4827-b727-e8c73d152aa6-kube-api-access-5jqk4\") pod \"nova-scheduler-0\" (UID: \"01e86425-f126-4827-b727-e8c73d152aa6\") " pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.321650 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.670515 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6868acd-5476-49b4-958c-8f68fde161b9" path="/var/lib/kubelet/pods/b6868acd-5476-49b4-958c-8f68fde161b9/volumes" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.671505 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4" path="/var/lib/kubelet/pods/dc78a7af-7e1d-4fd1-b868-47b1dc5db4e4/volumes" Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.801555 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.947473 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01e86425-f126-4827-b727-e8c73d152aa6","Type":"ContainerStarted","Data":"19b8700ac8cce409d32085026c5d7ffcaf0b8e22f583b1a8ade9d67819c332dd"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.949549 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00b43558-bdf4-45e4-b1bc-6e9b325e163b","Type":"ContainerStarted","Data":"e5f1dcb2f34dfe470d97b1c12abcb44998274cb922a1ab2d2504d7e0b8c6bc9e"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.949572 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00b43558-bdf4-45e4-b1bc-6e9b325e163b","Type":"ContainerStarted","Data":"9a99e76dfaa85feb5f4523b3781167ab6fc1367c7b8e8e94f4e06e4e02c29d88"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.949582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"00b43558-bdf4-45e4-b1bc-6e9b325e163b","Type":"ContainerStarted","Data":"6f9eb52c8c676828f1c30311e3603b02d4889a09c5d60400352372d8d2e38285"} Mar 13 20:50:43 crc kubenswrapper[4790]: I0313 20:50:43.972108 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.9720813069999998 podStartE2EDuration="1.972081307s" podCreationTimestamp="2026-03-13 20:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:43.967189375 +0000 UTC m=+1374.988305276" watchObservedRunningTime="2026-03-13 20:50:43.972081307 +0000 UTC m=+1374.993197188" Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.015410 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.015472 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.973746 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"01e86425-f126-4827-b727-e8c73d152aa6","Type":"ContainerStarted","Data":"c6dd172d17edfa4d9ed750f7157dd649bef6a98b28de8ea3e32b681e355e3615"} Mar 13 20:50:44 crc kubenswrapper[4790]: I0313 20:50:44.993117 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.993101106 podStartE2EDuration="2.993101106s" podCreationTimestamp="2026-03-13 20:50:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:50:44.989664312 +0000 UTC m=+1376.010780213" watchObservedRunningTime="2026-03-13 20:50:44.993101106 +0000 UTC m=+1376.014216997" Mar 13 20:50:48 crc kubenswrapper[4790]: E0313 20:50:48.051001 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode71d98c3_e247_448e_945e_016a6755c689.slice/crio-55f3196c901a679f999ea7048b99d1e69e5d8f8dcae2885a569b98a151420968\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod37e33a9e_1def_49b1_b1a7_81be1f5e72ee.slice/crio-a7693eddaf0a22601e6dc9f54784ec4f74f708b3aed816092645a24ca4db0419\": RecentStats: unable to find data in memory cache]" Mar 13 20:50:48 crc kubenswrapper[4790]: I0313 20:50:48.322794 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 13 20:50:49 crc kubenswrapper[4790]: I0313 20:50:49.268657 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:49 crc kubenswrapper[4790]: I0313 20:50:49.268733 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 13 20:50:50 crc kubenswrapper[4790]: I0313 20:50:50.280588 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4597d91c-0f9f-4e33-aaa7-b25e7076e13a" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:50 crc kubenswrapper[4790]: I0313 20:50:50.280774 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4597d91c-0f9f-4e33-aaa7-b25e7076e13a" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.210:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:52 crc kubenswrapper[4790]: I0313 20:50:52.639098 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:52 crc kubenswrapper[4790]: I0313 20:50:52.639193 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 13 20:50:52 crc kubenswrapper[4790]: I0313 20:50:52.707176 4790 scope.go:117] "RemoveContainer" containerID="721d15acd59eb0b2b9f8d48eaa51f02f0b2b5cc626d1243f5a398968f008ce5a" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.322168 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 13 
20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.357259 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.652615 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="00b43558-bdf4-45e4-b1bc-6e9b325e163b" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:53 crc kubenswrapper[4790]: I0313 20:50:53.653059 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="00b43558-bdf4-45e4-b1bc-6e9b325e163b" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 20:50:54 crc kubenswrapper[4790]: I0313 20:50:54.088991 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 13 20:50:56 crc kubenswrapper[4790]: I0313 20:50:56.064893 4790 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 13 20:50:57 crc kubenswrapper[4790]: I0313 20:50:57.268421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:57 crc kubenswrapper[4790]: I0313 20:50:57.268480 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 13 20:50:57 crc kubenswrapper[4790]: I0313 20:50:57.695779 4790 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pode5f74b87-8c4a-490f-ad9c-75ba17e3a1a8"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pode5f74b87-8c4a-490f-ad9c-75ba17e3a1a8] : Timed out while waiting for systemd to remove kubepods-besteffort-pode5f74b87_8c4a_490f_ad9c_75ba17e3a1a8.slice" Mar 13 20:50:59 crc kubenswrapper[4790]: I0313 20:50:59.274739 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:59 crc kubenswrapper[4790]: I0313 20:50:59.277005 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 13 20:50:59 crc kubenswrapper[4790]: I0313 20:50:59.283416 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:51:00 crc kubenswrapper[4790]: I0313 20:51:00.116256 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 13 20:51:00 crc kubenswrapper[4790]: I0313 20:51:00.639749 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:51:00 crc kubenswrapper[4790]: I0313 20:51:00.640151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129695 4790 generic.go:334] "Generic (PLEG): container finished" podID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerID="684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8" exitCode=137 Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129742 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8"} Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129960 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"dd8215d8-8b4d-4c20-a832-e2088825019b","Type":"ContainerDied","Data":"18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c"} Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.129975 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18dee079958c9239905a09047fe9e0fae646c0f6d6b8cea56e2986dac7e9414c" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.181304 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244693 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244754 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244805 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244844 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.244883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") pod \"dd8215d8-8b4d-4c20-a832-e2088825019b\" (UID: \"dd8215d8-8b4d-4c20-a832-e2088825019b\") " Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.246208 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.246322 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.255714 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9" (OuterVolumeSpecName: "kube-api-access-xq5f9") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "kube-api-access-xq5f9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.256538 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts" (OuterVolumeSpecName: "scripts") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.281243 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.294363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.318620 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.343485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data" (OuterVolumeSpecName: "config-data") pod "dd8215d8-8b4d-4c20-a832-e2088825019b" (UID: "dd8215d8-8b4d-4c20-a832-e2088825019b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347916 4790 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-scripts\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347951 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xq5f9\" (UniqueName: \"kubernetes.io/projected/dd8215d8-8b4d-4c20-a832-e2088825019b-kube-api-access-xq5f9\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347965 4790 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347977 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347987 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.347998 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dd8215d8-8b4d-4c20-a832-e2088825019b-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.348008 4790 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.348019 4790 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/dd8215d8-8b4d-4c20-a832-e2088825019b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.646630 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.650220 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 13 20:51:02 crc kubenswrapper[4790]: I0313 20:51:02.656676 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.137642 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.142993 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.187429 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.212537 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.227530 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228073 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228097 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228122 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228132 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228154 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228163 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" Mar 13 20:51:03 crc kubenswrapper[4790]: E0313 20:51:03.228186 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228195 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228418 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="sg-core" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228451 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="proxy-httpd" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228469 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-central-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.228491 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" containerName="ceilometer-notification-agent" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.230418 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.235680 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.236208 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.236531 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.243312 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367109 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367158 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-scripts\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367461 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-run-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367484 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-log-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367527 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-config-data\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367544 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.367578 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fv28v\" (UniqueName: 
\"kubernetes.io/projected/d2645f50-482e-487d-9b16-c2a066630480-kube-api-access-fv28v\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469308 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-run-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469360 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-log-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-config-data\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469559 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fv28v\" (UniqueName: \"kubernetes.io/projected/d2645f50-482e-487d-9b16-c2a066630480-kube-api-access-fv28v\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469619 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.469702 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-scripts\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.470020 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-log-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.470029 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/d2645f50-482e-487d-9b16-c2a066630480-run-httpd\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.475162 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-scripts\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.475255 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.475706 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.478117 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.484001 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2645f50-482e-487d-9b16-c2a066630480-config-data\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.490776 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fv28v\" (UniqueName: \"kubernetes.io/projected/d2645f50-482e-487d-9b16-c2a066630480-kube-api-access-fv28v\") pod \"ceilometer-0\" (UID: \"d2645f50-482e-487d-9b16-c2a066630480\") " pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.554548 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 13 20:51:03 crc kubenswrapper[4790]: I0313 20:51:03.680953 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8215d8-8b4d-4c20-a832-e2088825019b" path="/var/lib/kubelet/pods/dd8215d8-8b4d-4c20-a832-e2088825019b/volumes" Mar 13 20:51:04 crc kubenswrapper[4790]: W0313 20:51:04.019024 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2645f50_482e_487d_9b16_c2a066630480.slice/crio-7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c WatchSource:0}: Error finding container 7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c: Status 404 returned error can't find the container with id 7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c Mar 13 20:51:04 crc kubenswrapper[4790]: I0313 20:51:04.028451 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 13 20:51:04 crc kubenswrapper[4790]: I0313 20:51:04.148792 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"7ec42e3748933bdc74241b6eb424471bc69ad21f327f83d544cb366107361b3c"} Mar 13 20:51:05 crc kubenswrapper[4790]: I0313 20:51:05.158648 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"feffa8201208f291d13abd35b2c2dd546d70c74d9baf779e596335b07c113551"} Mar 13 20:51:06 crc kubenswrapper[4790]: I0313 20:51:06.167917 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"ea74cefcade4fe49ce3766e5d583e56536dfa46f88ac3f98686e4aaac580a73a"} Mar 13 20:51:06 crc kubenswrapper[4790]: I0313 20:51:06.169219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"c30e2102628823eab1ee3054424c04fbd3251b4b6386adcb17330147bbd91bd2"} Mar 13 20:51:09 crc kubenswrapper[4790]: I0313 20:51:09.209859 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d2645f50-482e-487d-9b16-c2a066630480","Type":"ContainerStarted","Data":"d6f7a91d9497dc21a044074639ef6035928f9b7d66af07297fc5f6dc5a406499"} Mar 13 20:51:09 crc kubenswrapper[4790]: I0313 20:51:09.210414 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 13 20:51:09 crc kubenswrapper[4790]: I0313 20:51:09.245076 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.041921117 podStartE2EDuration="6.245055046s" podCreationTimestamp="2026-03-13 20:51:03 +0000 UTC" firstStartedPulling="2026-03-13 20:51:04.025018692 +0000 UTC m=+1395.046134583" lastFinishedPulling="2026-03-13 20:51:08.228152621 +0000 UTC m=+1399.249268512" observedRunningTime="2026-03-13 20:51:09.228423722 +0000 UTC m=+1400.249539633" watchObservedRunningTime="2026-03-13 20:51:09.245055046 +0000 UTC m=+1400.266170937" Mar 13 20:51:14 crc kubenswrapper[4790]: I0313 20:51:14.015692 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
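The two durations in the ceilometer-0 startup entry above are mutually consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration matches the same interval with the image-pull window (lastFinishedPulling minus firstStartedPulling) subtracted. The short Go sketch below, using the timestamps copied from that log entry, reproduces both figures; reading the SLO value as "startup time excluding image pulls" is an inference from this arithmetic rather than something the log states.

package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching Go's default time.Time string form used in the log.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	// Timestamps copied from the ceilometer-0 startup-latency entry above.
	created := parse("2026-03-13 20:51:03 +0000 UTC")
	firstPull := parse("2026-03-13 20:51:04.025018692 +0000 UTC")
	lastPull := parse("2026-03-13 20:51:08.228152621 +0000 UTC")
	running := parse("2026-03-13 20:51:09.245055046 +0000 UTC")

	e2e := running.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 6.245055046s, as logged
	fmt.Println("podStartSLOduration:", slo) // 2.041921117s, as logged
}

The earlier nova-scheduler-0 entry, where both pulling timestamps are the zero time, reports equal values for the two durations, which is consistent with the same reading.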
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:51:14 crc kubenswrapper[4790]: I0313 20:51:14.016331 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:51:33 crc kubenswrapper[4790]: I0313 20:51:33.561784 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 13 20:51:43 crc kubenswrapper[4790]: I0313 20:51:43.007419 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.015827 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016138 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016183 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016915 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.016992 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb" gracePeriod=600 Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.073169 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545285 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb" exitCode=0 Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545368 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb"} Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545492 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
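Following the liveness failure above, the kubelet notes that machine-config-daemon "failed liveness probe, will be restarted" and kills the container with a 600-second grace period. The sketch below only illustrates the generic stop-with-grace-period pattern (signal, wait up to the deadline, then force kill); it is not the kubelet or CRI-O implementation, and the sleep command and two-second grace value are placeholders.

package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace asks the process to exit, waits up to the grace period,
// then force-kills it, mirroring the "Killing container with a grace period"
// behaviour recorded above in spirit only.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) {
	_ = cmd.Process.Signal(syscall.SIGTERM)
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case <-done:
		fmt.Println("exited within grace period")
	case <-time.After(grace):
		_ = cmd.Process.Kill()
		<-done
		fmt.Println("force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "30") // placeholder workload
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	stopWithGrace(cmd, 2*time.Second)
}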
event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e"} Mar 13 20:51:44 crc kubenswrapper[4790]: I0313 20:51:44.545514 4790 scope.go:117] "RemoveContainer" containerID="232d637183e61cb15eeba88ed1e9cabcbc6f085073f5f974ddeeeb1a6f8eb83c" Mar 13 20:51:47 crc kubenswrapper[4790]: I0313 20:51:47.749329 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" containerID="cri-o://b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" gracePeriod=604796 Mar 13 20:51:48 crc kubenswrapper[4790]: I0313 20:51:48.352953 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="rabbitmq" containerID="cri-o://9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4" gracePeriod=604796 Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.308530 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.348886 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.348947 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349037 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349095 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349232 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349257 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349298 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349320 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349342 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.349357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") pod \"e50b80fb-2251-49e7-a285-1276dbaa3237\" (UID: \"e50b80fb-2251-49e7-a285-1276dbaa3237\") " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.355707 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q" (OuterVolumeSpecName: "kube-api-access-2zf2q") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "kube-api-access-2zf2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.355938 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.356428 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.356434 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.356975 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "persistence") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.357231 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.362127 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info" (OuterVolumeSpecName: "pod-info") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.367011 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.383267 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data" (OuterVolumeSpecName: "config-data") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.424255 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf" (OuterVolumeSpecName: "server-conf") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452799 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452833 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452842 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452853 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/e50b80fb-2251-49e7-a285-1276dbaa3237-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452861 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452869 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452877 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/e50b80fb-2251-49e7-a285-1276dbaa3237-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452884 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/e50b80fb-2251-49e7-a285-1276dbaa3237-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452892 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zf2q\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-kube-api-access-2zf2q\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.452920 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.458326 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "e50b80fb-2251-49e7-a285-1276dbaa3237" (UID: "e50b80fb-2251-49e7-a285-1276dbaa3237"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.481160 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.554275 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.554312 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/e50b80fb-2251-49e7-a285-1276dbaa3237-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855111 4790 generic.go:334] "Generic (PLEG): container finished" podID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" exitCode=0 Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855203 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855231 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerDied","Data":"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e"} Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"e50b80fb-2251-49e7-a285-1276dbaa3237","Type":"ContainerDied","Data":"6218f617d211db14656d09a088c6de02a6677348fa07bdf9d142d99af0111ad7"} Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.855568 4790 scope.go:117] "RemoveContainer" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.857928 4790 generic.go:334] "Generic (PLEG): container finished" podID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerID="9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4" exitCode=0 Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.857980 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerDied","Data":"9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4"} Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.892675 4790 scope.go:117] "RemoveContainer" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.900603 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.919315 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.940292 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.940780 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.940797 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.940854 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="setup-container" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.940863 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="setup-container" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.941062 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" containerName="rabbitmq" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.942314 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.950818 4790 scope.go:117] "RemoveContainer" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.954941 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.956538 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e\": container with ID starting with b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e not found: ID does not exist" containerID="b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.956583 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e"} err="failed to get container status \"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e\": rpc error: code = NotFound desc = could not find container \"b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e\": container with ID starting with b18051e1928d6d4be1f49a88b66c5526904bec3db8483f41365c79d187155b0e not found: ID does not exist" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.956612 4790 scope.go:117] "RemoveContainer" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957634 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957794 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.957927 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-bssvd" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.958088 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.958114 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 13 20:51:54 crc kubenswrapper[4790]: E0313 20:51:54.971776 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde\": container with ID starting with e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde not found: ID does not exist" containerID="e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.971841 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde"} err="failed to get container status \"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde\": rpc error: code = NotFound desc = could not find container \"e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde\": container with ID starting with e8486a086425a64010be822959f267eda3cb5597406c8e8b2ac6ed4829dcbdde not found: ID does not exist" Mar 13 20:51:54 crc kubenswrapper[4790]: I0313 20:51:54.976890 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063783 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72ed8a4f-a46a-4e41-9335-f10dc6338627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063843 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063904 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72ed8a4f-a46a-4e41-9335-f10dc6338627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063938 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.063997 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctjqk\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-kube-api-access-ctjqk\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064052 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064100 4790 reconciler_common.go:245] 
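The two "ContainerStatus from runtime service failed" errors above are gRPC NotFound responses for containers that had already been removed; the kubelet logs the follow-up "DeleteContainer returned error" and moves on, since a missing container needs no further cleanup. The sketch below shows how such an error can be recognized with the grpc-go status and codes packages; the isNotFound helper and the literal error message are illustrative stand-ins, not kubelet code.

package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// isNotFound reports whether err carries gRPC code NotFound, which is how the
// runtime answers a status query for a container that no longer exists.
func isNotFound(err error) bool {
	s, ok := status.FromError(err)
	return ok && s.Code() == codes.NotFound
}

func main() {
	// Stand-in for the runtime error quoted in the log above.
	err := status.Error(codes.NotFound, `could not find container "b18051e19..."`)
	if isNotFound(err) {
		fmt.Println("container already gone; nothing left to delete")
	}
}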
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-config-data\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064143 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064170 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064197 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.064234 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165164 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165479 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165529 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165608 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72ed8a4f-a46a-4e41-9335-f10dc6338627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165633 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165678 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72ed8a4f-a46a-4e41-9335-f10dc6338627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165704 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165753 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctjqk\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-kube-api-access-ctjqk\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165800 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165835 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-config-data\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165862 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.165879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.166400 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.166472 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.166592 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.167017 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.167101 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-config-data\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.167612 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/72ed8a4f-a46a-4e41-9335-f10dc6338627-server-conf\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.171954 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/72ed8a4f-a46a-4e41-9335-f10dc6338627-pod-info\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.172094 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.172212 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.176886 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/72ed8a4f-a46a-4e41-9335-f10dc6338627-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.183431 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctjqk\" (UniqueName: \"kubernetes.io/projected/72ed8a4f-a46a-4e41-9335-f10dc6338627-kube-api-access-ctjqk\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.211039 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"rabbitmq-server-0\" (UID: \"72ed8a4f-a46a-4e41-9335-f10dc6338627\") " pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.266895 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.266943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267027 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267079 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267112 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267157 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267177 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267197 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267247 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267300 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267355 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") pod \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\" (UID: \"c575f482-56cd-4dfc-84c6-c6bb922d56a9\") " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267527 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267703 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.267830 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.268121 4790 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.268143 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.268154 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.270862 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.273124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.279822 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b" (OuterVolumeSpecName: "kube-api-access-skg8b") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "kube-api-access-skg8b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.283932 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.293548 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data" (OuterVolumeSpecName: "config-data") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.293664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info" (OuterVolumeSpecName: "pod-info") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.310324 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.331195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf" (OuterVolumeSpecName: "server-conf") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370180 4790 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-server-conf\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370222 4790 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c575f482-56cd-4dfc-84c6-c6bb922d56a9-pod-info\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370236 4790 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c575f482-56cd-4dfc-84c6-c6bb922d56a9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370250 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370261 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c575f482-56cd-4dfc-84c6-c6bb922d56a9-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370288 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.370301 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skg8b\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-kube-api-access-skg8b\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.397711 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.420339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "c575f482-56cd-4dfc-84c6-c6bb922d56a9" (UID: "c575f482-56cd-4dfc-84c6-c6bb922d56a9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.473294 4790 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c575f482-56cd-4dfc-84c6-c6bb922d56a9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.473328 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.671447 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e50b80fb-2251-49e7-a285-1276dbaa3237" path="/var/lib/kubelet/pods/e50b80fb-2251-49e7-a285-1276dbaa3237/volumes" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.769288 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.869568 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c575f482-56cd-4dfc-84c6-c6bb922d56a9","Type":"ContainerDied","Data":"dd3eb9a0e5bdb0287eed7cfa261bf8f63d9daa5df053b0925d31bc794e7ad761"} Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.869599 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.869640 4790 scope.go:117] "RemoveContainer" containerID="9121b19136cbb4acb4e68cfa3615a87c401027dec9c5c50a951e3a05a6de57b4" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.872324 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerStarted","Data":"20ca964e1e08449d26c45c0061f210e92a451b15f9229ba73e5bfff41e0c13ed"} Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.895741 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.898723 4790 scope.go:117] "RemoveContainer" containerID="a8891038882e88af0702659321fde381a785634e4a17975de8d9af4797337040" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.909009 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.922129 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:55 crc kubenswrapper[4790]: E0313 20:51:55.922941 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="rabbitmq" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.922965 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="rabbitmq" Mar 13 20:51:55 crc kubenswrapper[4790]: E0313 20:51:55.923002 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="setup-container" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.923009 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" containerName="setup-container" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.923182 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" 
containerName="rabbitmq" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.924131 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931677 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931849 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.931928 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-6fg95" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.932260 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.932650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.932826 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 13 20:51:55 crc kubenswrapper[4790]: I0313 20:51:55.951258 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.105872 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106203 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106236 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106308 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106626 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106707 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.106995 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.107055 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p45h4\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-kube-api-access-p45h4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.107115 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.208864 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.208950 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209002 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209068 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209097 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p45h4\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-kube-api-access-p45h4\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209247 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209274 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209294 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.209671 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.211478 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.212158 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.212496 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.212823 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.213331 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.219359 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.219504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.220062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.221882 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.230789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p45h4\" (UniqueName: \"kubernetes.io/projected/4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9-kube-api-access-p45h4\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.241165 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9\") " pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.250840 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.337618 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.340164 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.342845 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.354312 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.517653 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.517986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518008 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518061 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518088 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.518151 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619300 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619349 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619413 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619508 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619536 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.619645 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620788 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620863 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.620845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.621508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.621751 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.645193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"dnsmasq-dns-79bd4cc8c9-gwl49\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.718513 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.791887 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 13 20:51:56 crc kubenswrapper[4790]: W0313 20:51:56.791990 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ac7c2bb_fa6a_437a_9af3_d4ffa930bdf9.slice/crio-5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463 WatchSource:0}: Error finding container 5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463: Status 404 returned error can't find the container with id 5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463 Mar 13 20:51:56 crc kubenswrapper[4790]: I0313 20:51:56.892390 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerStarted","Data":"5ca9b139f82dc48918b214c007c4d393720a6ef7e8e7572afe6f0afc1d963463"} Mar 13 20:51:57 crc kubenswrapper[4790]: W0313 20:51:57.192208 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1f7ae1d_5633_4005_8813_533cadffdf5f.slice/crio-32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc WatchSource:0}: Error finding container 32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc: Status 404 returned error can't find the container with id 32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.193605 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.671424 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c575f482-56cd-4dfc-84c6-c6bb922d56a9" path="/var/lib/kubelet/pods/c575f482-56cd-4dfc-84c6-c6bb922d56a9/volumes" Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.905081 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91" exitCode=0 Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.905175 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerDied","Data":"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"} Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.905233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerStarted","Data":"32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc"} Mar 13 20:51:57 crc kubenswrapper[4790]: I0313 20:51:57.907161 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerStarted","Data":"26817f017f043cd97911724e5d41f909397e98b30bfd97efcb9244f2cb38d580"} Mar 13 20:51:58 crc kubenswrapper[4790]: I0313 20:51:58.918534 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerStarted","Data":"6cc57bdf5a38660fb3604a1deab2679244c408dc5a05666a57500776843ad98d"} Mar 13 20:51:58 crc 
kubenswrapper[4790]: I0313 20:51:58.921350 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerStarted","Data":"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"} Mar 13 20:51:58 crc kubenswrapper[4790]: I0313 20:51:58.987543 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" podStartSLOduration=2.987494076 podStartE2EDuration="2.987494076s" podCreationTimestamp="2026-03-13 20:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:51:58.973826334 +0000 UTC m=+1449.994942235" watchObservedRunningTime="2026-03-13 20:51:58.987494076 +0000 UTC m=+1450.008609967" Mar 13 20:51:59 crc kubenswrapper[4790]: I0313 20:51:59.947575 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.140507 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.143182 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.145764 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.147687 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.147863 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.151243 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.199919 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"auto-csr-approver-29557252-mfnmk\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.302189 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"auto-csr-approver-29557252-mfnmk\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.323337 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"auto-csr-approver-29557252-mfnmk\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.471457 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.899608 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:52:00 crc kubenswrapper[4790]: I0313 20:52:00.955664 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" event={"ID":"b77751d8-7e07-4d67-9bed-3858cbfc5c3f","Type":"ContainerStarted","Data":"cbc02170592008372a96345d06138ff81a99dceb83db04c5eb0d1033f77737c4"} Mar 13 20:52:02 crc kubenswrapper[4790]: I0313 20:52:02.976669 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" event={"ID":"b77751d8-7e07-4d67-9bed-3858cbfc5c3f","Type":"ContainerDied","Data":"b5ea61f802c1b094e15351a6cc95042eca8f16ab2272c8f7af336afbb299a8d5"} Mar 13 20:52:02 crc kubenswrapper[4790]: I0313 20:52:02.976514 4790 generic.go:334] "Generic (PLEG): container finished" podID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerID="b5ea61f802c1b094e15351a6cc95042eca8f16ab2272c8f7af336afbb299a8d5" exitCode=0 Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.322130 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.379334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") pod \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\" (UID: \"b77751d8-7e07-4d67-9bed-3858cbfc5c3f\") " Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.386223 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt" (OuterVolumeSpecName: "kube-api-access-jxqqt") pod "b77751d8-7e07-4d67-9bed-3858cbfc5c3f" (UID: "b77751d8-7e07-4d67-9bed-3858cbfc5c3f"). InnerVolumeSpecName "kube-api-access-jxqqt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.481612 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxqqt\" (UniqueName: \"kubernetes.io/projected/b77751d8-7e07-4d67-9bed-3858cbfc5c3f-kube-api-access-jxqqt\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.997215 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" event={"ID":"b77751d8-7e07-4d67-9bed-3858cbfc5c3f","Type":"ContainerDied","Data":"cbc02170592008372a96345d06138ff81a99dceb83db04c5eb0d1033f77737c4"} Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.997583 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbc02170592008372a96345d06138ff81a99dceb83db04c5eb0d1033f77737c4" Mar 13 20:52:04 crc kubenswrapper[4790]: I0313 20:52:04.997252 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557252-mfnmk" Mar 13 20:52:05 crc kubenswrapper[4790]: I0313 20:52:05.392488 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:52:05 crc kubenswrapper[4790]: I0313 20:52:05.400145 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557246-lrvrv"] Mar 13 20:52:05 crc kubenswrapper[4790]: I0313 20:52:05.670490 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97e8561a-a685-44f0-986c-1559e5818ba8" path="/var/lib/kubelet/pods/97e8561a-a685-44f0-986c-1559e5818ba8/volumes" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.720421 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.777020 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.777600 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns" containerID="cri-o://4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29" gracePeriod=10 Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.969437 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p5ml2"] Mar 13 20:52:06 crc kubenswrapper[4790]: E0313 20:52:06.969798 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerName="oc" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.969814 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerName="oc" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.970013 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" containerName="oc" Mar 13 20:52:06 crc kubenswrapper[4790]: I0313 20:52:06.971145 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.000966 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p5ml2"] Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.038440 4790 generic.go:334] "Generic (PLEG): container finished" podID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerID="4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29" exitCode=0 Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.038485 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerDied","Data":"4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29"} Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.041758 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-svc\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.041884 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.041925 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042140 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042289 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042421 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq9d8\" (UniqueName: \"kubernetes.io/projected/66175627-2b03-49c6-a7a1-de69f8851d9a-kube-api-access-hq9d8\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.042529 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-config\") pod 
\"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144879 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq9d8\" (UniqueName: \"kubernetes.io/projected/66175627-2b03-49c6-a7a1-de69f8851d9a-kube-api-access-hq9d8\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144918 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-config\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.144997 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-svc\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.145093 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.145142 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.145168 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.146818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-sb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.146999 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-swift-storage-0\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: 
\"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.147110 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-openstack-edpm-ipam\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.147827 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-ovsdbserver-nb\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.148307 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-dns-svc\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.148853 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/66175627-2b03-49c6-a7a1-de69f8851d9a-config\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.167300 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq9d8\" (UniqueName: \"kubernetes.io/projected/66175627-2b03-49c6-a7a1-de69f8851d9a-kube-api-access-hq9d8\") pod \"dnsmasq-dns-55478c4467-p5ml2\" (UID: \"66175627-2b03-49c6-a7a1-de69f8851d9a\") " pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.289033 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.389633 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454283 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454438 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454468 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454570 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454632 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.454677 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.458997 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw" (OuterVolumeSpecName: "kube-api-access-rz7vw") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "kube-api-access-rz7vw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.521146 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.525503 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config" (OuterVolumeSpecName: "config") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.532598 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: E0313 20:52:07.540677 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc podName:03ea3d76-1bca-44e8-986c-8e751336f93d nodeName:}" failed. No retries permitted until 2026-03-13 20:52:08.040652589 +0000 UTC m=+1459.061768480 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "dns-svc" (UniqueName: "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d") : error deleting /var/lib/kubelet/pods/03ea3d76-1bca-44e8-986c-8e751336f93d/volume-subpaths: remove /var/lib/kubelet/pods/03ea3d76-1bca-44e8-986c-8e751336f93d/volume-subpaths: no such file or directory Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.540954 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557049 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557097 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557113 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557126 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.557137 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rz7vw\" (UniqueName: \"kubernetes.io/projected/03ea3d76-1bca-44e8-986c-8e751336f93d-kube-api-access-rz7vw\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:07 crc kubenswrapper[4790]: I0313 20:52:07.736064 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-55478c4467-p5ml2"] Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.048276 4790 generic.go:334] "Generic (PLEG): container finished" podID="66175627-2b03-49c6-a7a1-de69f8851d9a" containerID="27871394f5ffcdc35185d80b9e0ce4c575067fe2fa1efdd455ca8a9c92d8ff49" exitCode=0 Mar 13 20:52:08 crc 
kubenswrapper[4790]: I0313 20:52:08.048338 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" event={"ID":"66175627-2b03-49c6-a7a1-de69f8851d9a","Type":"ContainerDied","Data":"27871394f5ffcdc35185d80b9e0ce4c575067fe2fa1efdd455ca8a9c92d8ff49"} Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.048364 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" event={"ID":"66175627-2b03-49c6-a7a1-de69f8851d9a","Type":"ContainerStarted","Data":"6927ac38b8d0f8f8ffc931229c562675d8587b274b5beeb240669881cdf429f0"} Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.050333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" event={"ID":"03ea3d76-1bca-44e8-986c-8e751336f93d","Type":"ContainerDied","Data":"b70c45deb3c4d259bee0d048396a3a151ac96b6eb37804970ed797de5f96a100"} Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.050408 4790 scope.go:117] "RemoveContainer" containerID="4bd79e27621e3d3d4bce68941d1a486f8bc96266be819067b6ade98b7e023e29" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.050403 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-89c5cd4d5-kqzmj" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.065716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") pod \"03ea3d76-1bca-44e8-986c-8e751336f93d\" (UID: \"03ea3d76-1bca-44e8-986c-8e751336f93d\") " Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.068314 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "03ea3d76-1bca-44e8-986c-8e751336f93d" (UID: "03ea3d76-1bca-44e8-986c-8e751336f93d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.167777 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/03ea3d76-1bca-44e8-986c-8e751336f93d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.225118 4790 scope.go:117] "RemoveContainer" containerID="3bf4c1a3a8959712b6bdc6bb2a33893090891a1211e7646c25c1b2fcadfa4181" Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.384180 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:52:08 crc kubenswrapper[4790]: I0313 20:52:08.394490 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-89c5cd4d5-kqzmj"] Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.061688 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" event={"ID":"66175627-2b03-49c6-a7a1-de69f8851d9a","Type":"ContainerStarted","Data":"82ab853eb5a738c8ea6d020bf49ff5de2f7ce41d17db3f977cc89e0d8624b3de"} Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.062710 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.086388 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" podStartSLOduration=3.086352553 podStartE2EDuration="3.086352553s" podCreationTimestamp="2026-03-13 20:52:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:52:09.084907404 +0000 UTC m=+1460.106023305" watchObservedRunningTime="2026-03-13 20:52:09.086352553 +0000 UTC m=+1460.107468444" Mar 13 20:52:09 crc kubenswrapper[4790]: I0313 20:52:09.697423 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" path="/var/lib/kubelet/pods/03ea3d76-1bca-44e8-986c-8e751336f93d/volumes" Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.290977 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-55478c4467-p5ml2" Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.355745 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.356103 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns" containerID="cri-o://038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" gracePeriod=10 Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.829966 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997625 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997681 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997779 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997868 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997940 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.997976 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:17 crc kubenswrapper[4790]: I0313 20:52:17.998103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") pod \"f1f7ae1d-5633-4005-8813-533cadffdf5f\" (UID: \"f1f7ae1d-5633-4005-8813-533cadffdf5f\") " Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.008469 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8" (OuterVolumeSpecName: "kube-api-access-9j4h8") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "kube-api-access-9j4h8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.054202 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.060464 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config" (OuterVolumeSpecName: "config") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.063406 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.066025 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.066501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.073947 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f1f7ae1d-5633-4005-8813-533cadffdf5f" (UID: "f1f7ae1d-5633-4005-8813-533cadffdf5f"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100247 4790 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100291 4790 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100306 4790 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-config\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100318 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9j4h8\" (UniqueName: \"kubernetes.io/projected/f1f7ae1d-5633-4005-8813-533cadffdf5f-kube-api-access-9j4h8\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100329 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100341 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.100351 4790 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f1f7ae1d-5633-4005-8813-533cadffdf5f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147527 4790 generic.go:334] "Generic (PLEG): container finished" podID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" exitCode=0 Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147574 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerDied","Data":"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"} Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147614 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" event={"ID":"f1f7ae1d-5633-4005-8813-533cadffdf5f","Type":"ContainerDied","Data":"32be04d7937ad9af79c18417e3fc9bc5c26ab317292137d21354d38bf769ecdc"} Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147636 4790 scope.go:117] "RemoveContainer" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.147964 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79bd4cc8c9-gwl49" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.185503 4790 scope.go:117] "RemoveContainer" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.188684 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.198351 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79bd4cc8c9-gwl49"] Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.222937 4790 scope.go:117] "RemoveContainer" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" Mar 13 20:52:18 crc kubenswrapper[4790]: E0313 20:52:18.223940 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc\": container with ID starting with 038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc not found: ID does not exist" containerID="038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.224022 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc"} err="failed to get container status \"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc\": rpc error: code = NotFound desc = could not find container \"038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc\": container with ID starting with 038efb381a154d3160e1dc28e96308ddaf7338a7ddf3d1cc01735e644feb1bfc not found: ID does not exist" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.224054 4790 scope.go:117] "RemoveContainer" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91" Mar 13 20:52:18 crc kubenswrapper[4790]: E0313 20:52:18.224367 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91\": container with ID starting with 9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91 not found: ID does not exist" containerID="9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91" Mar 13 20:52:18 crc kubenswrapper[4790]: I0313 20:52:18.224435 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91"} err="failed to get container status \"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91\": rpc error: code = NotFound desc = could not find container \"9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91\": container with ID starting with 9d326b0055fe1b60ac8b2237b02db615dcfa6ce3d1b10696331136dd47dd8e91 not found: ID does not exist" Mar 13 20:52:19 crc kubenswrapper[4790]: I0313 20:52:19.673803 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" path="/var/lib/kubelet/pods/f1f7ae1d-5633-4005-8813-533cadffdf5f/volumes" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.284037 4790 generic.go:334] "Generic (PLEG): container finished" podID="72ed8a4f-a46a-4e41-9335-f10dc6338627" containerID="26817f017f043cd97911724e5d41f909397e98b30bfd97efcb9244f2cb38d580" 
exitCode=0 Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.284126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerDied","Data":"26817f017f043cd97911724e5d41f909397e98b30bfd97efcb9244f2cb38d580"} Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378450 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"] Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378895 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="init" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378912 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="init" Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378938 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="init" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378945 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="init" Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378966 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378974 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns" Mar 13 20:52:30 crc kubenswrapper[4790]: E0313 20:52:30.378986 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.378991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.379158 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="03ea3d76-1bca-44e8-986c-8e751336f93d" containerName="dnsmasq-dns" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.379175 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f1f7ae1d-5633-4005-8813-533cadffdf5f" containerName="dnsmasq-dns" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.379860 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.385327 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.385492 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.385332 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.387840 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.404914 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"] Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.541807 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.542157 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.542442 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.542752 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645114 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645312 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645410 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.645455 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.649762 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.650033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.651268 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.664100 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:30 crc kubenswrapper[4790]: I0313 20:52:30.779263 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.297561 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9" containerID="6cc57bdf5a38660fb3604a1deab2679244c408dc5a05666a57500776843ad98d" exitCode=0 Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.297633 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerDied","Data":"6cc57bdf5a38660fb3604a1deab2679244c408dc5a05666a57500776843ad98d"} Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.301748 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"72ed8a4f-a46a-4e41-9335-f10dc6338627","Type":"ContainerStarted","Data":"944ee40aa4f68b4e17b6cc6b58efac933bc0b5da588542fe5893c11c3a99ebe6"} Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.302274 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.333260 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h"] Mar 13 20:52:31 crc kubenswrapper[4790]: I0313 20:52:31.359790 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=37.359766625 podStartE2EDuration="37.359766625s" podCreationTimestamp="2026-03-13 20:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:52:31.346473693 +0000 UTC m=+1482.367589594" watchObservedRunningTime="2026-03-13 20:52:31.359766625 +0000 UTC m=+1482.380882516" Mar 13 20:52:32 crc kubenswrapper[4790]: I0313 20:52:32.311834 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerStarted","Data":"011eaa7798518916fd1cd36b7162e72c8aa57ffd69ac6ec084b6add865ff6d11"} Mar 13 20:52:32 crc kubenswrapper[4790]: I0313 20:52:32.315496 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9","Type":"ContainerStarted","Data":"6af6fba67a214b440e989118f8ced7b17ad8d38fcc1fe03265f7ac7a6dce9d17"} Mar 13 20:52:32 crc kubenswrapper[4790]: I0313 20:52:32.315758 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.448163 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=44.448138399 podStartE2EDuration="44.448138399s" podCreationTimestamp="2026-03-13 20:51:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 20:52:32.342869749 +0000 UTC m=+1483.363985650" watchObservedRunningTime="2026-03-13 20:52:39.448138399 +0000 UTC m=+1490.469254290" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.456078 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.458101 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.471181 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.526931 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.527102 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.527135 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.629264 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.629767 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.629807 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.630528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.631231 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.658524 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"redhat-operators-xck9d\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:39 crc kubenswrapper[4790]: I0313 20:52:39.789182 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:41 crc kubenswrapper[4790]: I0313 20:52:41.732337 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.412674 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerStarted","Data":"68cbadfb5fc5eaf6da98e8021273feb1407956bc4335ba31888729ab3136c705"} Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.416951 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" exitCode=0 Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.416994 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e"} Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.417021 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerStarted","Data":"a7c04727a428df15d4ea2c9336e79e6ee57e874b67d1a38f59d2d7e8b47dd15c"} Mar 13 20:52:42 crc kubenswrapper[4790]: I0313 20:52:42.431007 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" podStartSLOduration=2.387143774 podStartE2EDuration="12.430983873s" podCreationTimestamp="2026-03-13 20:52:30 +0000 UTC" firstStartedPulling="2026-03-13 20:52:31.32207379 +0000 UTC m=+1482.343189691" lastFinishedPulling="2026-03-13 20:52:41.365913899 +0000 UTC m=+1492.387029790" observedRunningTime="2026-03-13 20:52:42.429630617 +0000 UTC m=+1493.450746508" watchObservedRunningTime="2026-03-13 20:52:42.430983873 +0000 UTC m=+1493.452099764" Mar 13 20:52:44 crc kubenswrapper[4790]: I0313 20:52:44.439541 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerStarted","Data":"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf"} Mar 13 20:52:45 crc kubenswrapper[4790]: I0313 20:52:45.313623 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 13 20:52:45 crc kubenswrapper[4790]: I0313 20:52:45.454577 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" exitCode=0 Mar 13 20:52:45 crc kubenswrapper[4790]: I0313 20:52:45.454668 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" 
event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf"} Mar 13 20:52:46 crc kubenswrapper[4790]: I0313 20:52:46.254593 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 13 20:52:46 crc kubenswrapper[4790]: I0313 20:52:46.468402 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerStarted","Data":"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819"} Mar 13 20:52:46 crc kubenswrapper[4790]: I0313 20:52:46.492713 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xck9d" podStartSLOduration=4.032197105 podStartE2EDuration="7.492694107s" podCreationTimestamp="2026-03-13 20:52:39 +0000 UTC" firstStartedPulling="2026-03-13 20:52:42.418460603 +0000 UTC m=+1493.439576494" lastFinishedPulling="2026-03-13 20:52:45.878957605 +0000 UTC m=+1496.900073496" observedRunningTime="2026-03-13 20:52:46.487321051 +0000 UTC m=+1497.508436942" watchObservedRunningTime="2026-03-13 20:52:46.492694107 +0000 UTC m=+1497.513809998" Mar 13 20:52:49 crc kubenswrapper[4790]: I0313 20:52:49.790305 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:49 crc kubenswrapper[4790]: I0313 20:52:49.790861 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:50 crc kubenswrapper[4790]: I0313 20:52:50.833522 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xck9d" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" probeResult="failure" output=< Mar 13 20:52:50 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 20:52:50 crc kubenswrapper[4790]: > Mar 13 20:52:52 crc kubenswrapper[4790]: I0313 20:52:52.985722 4790 scope.go:117] "RemoveContainer" containerID="a76e1c0d1beff75ffaa42ee8715fd9733a320b575bcb2a1602abbb7840ddf694" Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.019640 4790 scope.go:117] "RemoveContainer" containerID="7c5a942da36087bdc3e181e8806caccf07be11d3c05fd4b5b28443007ca270c8" Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.053605 4790 scope.go:117] "RemoveContainer" containerID="a469cae8d28a17763807dc70d5fbc5f435ef49995e55c306927cfc053eea835d" Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.073599 4790 scope.go:117] "RemoveContainer" containerID="08d59d9ecbc8376b9de39bc3a93a8ca2a0b84d09598e5daa63ce7fe053fdaadf" Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.541855 4790 generic.go:334] "Generic (PLEG): container finished" podID="37459d15-1599-492b-8710-7723829a096d" containerID="68cbadfb5fc5eaf6da98e8021273feb1407956bc4335ba31888729ab3136c705" exitCode=0 Mar 13 20:52:53 crc kubenswrapper[4790]: I0313 20:52:53.542202 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerDied","Data":"68cbadfb5fc5eaf6da98e8021273feb1407956bc4335ba31888729ab3136c705"} Mar 13 20:52:54 crc kubenswrapper[4790]: I0313 20:52:54.993271 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.033897 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.034044 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.034187 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.034271 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") pod \"37459d15-1599-492b-8710-7723829a096d\" (UID: \"37459d15-1599-492b-8710-7723829a096d\") " Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.041164 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.044610 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn" (OuterVolumeSpecName: "kube-api-access-56jjn") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "kube-api-access-56jjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.061649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.066833 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory" (OuterVolumeSpecName: "inventory") pod "37459d15-1599-492b-8710-7723829a096d" (UID: "37459d15-1599-492b-8710-7723829a096d"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138176 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138470 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-56jjn\" (UniqueName: \"kubernetes.io/projected/37459d15-1599-492b-8710-7723829a096d-kube-api-access-56jjn\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138557 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.138670 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/37459d15-1599-492b-8710-7723829a096d-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.561578 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" event={"ID":"37459d15-1599-492b-8710-7723829a096d","Type":"ContainerDied","Data":"011eaa7798518916fd1cd36b7162e72c8aa57ffd69ac6ec084b6add865ff6d11"} Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.561627 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="011eaa7798518916fd1cd36b7162e72c8aa57ffd69ac6ec084b6add865ff6d11" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.561648 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.647077 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"] Mar 13 20:52:55 crc kubenswrapper[4790]: E0313 20:52:55.647595 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37459d15-1599-492b-8710-7723829a096d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.647621 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="37459d15-1599-492b-8710-7723829a096d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.647865 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="37459d15-1599-492b-8710-7723829a096d" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.648694 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.651904 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.652478 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.652849 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.654292 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.657568 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"] Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.750011 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.750174 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.750270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.852311 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.852441 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.852523 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.856336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.865483 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.871064 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zgxl2\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:55 crc kubenswrapper[4790]: I0313 20:52:55.972916 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:52:56 crc kubenswrapper[4790]: I0313 20:52:56.490622 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2"] Mar 13 20:52:56 crc kubenswrapper[4790]: I0313 20:52:56.572783 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerStarted","Data":"23954afae29dc0d42d7aa797a383ebbedc7c5ae34a41912117cbf97794e9592c"} Mar 13 20:52:57 crc kubenswrapper[4790]: I0313 20:52:57.583702 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerStarted","Data":"e536e5bc016726000a9f433d8e01bfad7c9bcef53ef7691e5f152680b0727e30"} Mar 13 20:52:57 crc kubenswrapper[4790]: I0313 20:52:57.615998 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" podStartSLOduration=2.201365979 podStartE2EDuration="2.615977662s" podCreationTimestamp="2026-03-13 20:52:55 +0000 UTC" firstStartedPulling="2026-03-13 20:52:56.496725773 +0000 UTC m=+1507.517841664" lastFinishedPulling="2026-03-13 20:52:56.911337456 +0000 UTC m=+1507.932453347" observedRunningTime="2026-03-13 20:52:57.609953748 +0000 UTC m=+1508.631069639" watchObservedRunningTime="2026-03-13 20:52:57.615977662 +0000 UTC m=+1508.637093553" Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.603533 4790 generic.go:334] "Generic (PLEG): container finished" podID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerID="e536e5bc016726000a9f433d8e01bfad7c9bcef53ef7691e5f152680b0727e30" exitCode=0 Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.603645 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerDied","Data":"e536e5bc016726000a9f433d8e01bfad7c9bcef53ef7691e5f152680b0727e30"} Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.837859 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:52:59 crc kubenswrapper[4790]: I0313 20:52:59.890785 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:53:00 crc kubenswrapper[4790]: I0313 20:53:00.071368 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:53:00 crc kubenswrapper[4790]: I0313 20:53:00.994984 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.062136 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") pod \"6383acac-fad0-45d2-8263-da2ceb0b9e83\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.062231 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") pod \"6383acac-fad0-45d2-8263-da2ceb0b9e83\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.062310 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") pod \"6383acac-fad0-45d2-8263-da2ceb0b9e83\" (UID: \"6383acac-fad0-45d2-8263-da2ceb0b9e83\") " Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.069233 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd" (OuterVolumeSpecName: "kube-api-access-pknwd") pod "6383acac-fad0-45d2-8263-da2ceb0b9e83" (UID: "6383acac-fad0-45d2-8263-da2ceb0b9e83"). InnerVolumeSpecName "kube-api-access-pknwd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.093501 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory" (OuterVolumeSpecName: "inventory") pod "6383acac-fad0-45d2-8263-da2ceb0b9e83" (UID: "6383acac-fad0-45d2-8263-da2ceb0b9e83"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.102589 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6383acac-fad0-45d2-8263-da2ceb0b9e83" (UID: "6383acac-fad0-45d2-8263-da2ceb0b9e83"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.165683 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.166001 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6383acac-fad0-45d2-8263-da2ceb0b9e83-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.166137 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pknwd\" (UniqueName: \"kubernetes.io/projected/6383acac-fad0-45d2-8263-da2ceb0b9e83-kube-api-access-pknwd\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623250 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" event={"ID":"6383acac-fad0-45d2-8263-da2ceb0b9e83","Type":"ContainerDied","Data":"23954afae29dc0d42d7aa797a383ebbedc7c5ae34a41912117cbf97794e9592c"} Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623317 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23954afae29dc0d42d7aa797a383ebbedc7c5ae34a41912117cbf97794e9592c" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623263 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zgxl2" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.623706 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xck9d" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" containerID="cri-o://c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" gracePeriod=2 Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.698442 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n"] Mar 13 20:53:01 crc kubenswrapper[4790]: E0313 20:53:01.698999 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.699090 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.699351 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6383acac-fad0-45d2-8263-da2ceb0b9e83" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.699977 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.705534 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.705907 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.706075 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.706253 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.709950 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n"] Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779137 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779206 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779271 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.779315 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881146 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881271 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.881307 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.886238 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.889092 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.893303 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:01 crc kubenswrapper[4790]: I0313 20:53:01.900274 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.069885 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.118428 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.189023 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") pod \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.189350 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") pod \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.189425 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") pod \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\" (UID: \"8e61b37d-27a6-44dc-83c2-1aa0b9850465\") " Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.190130 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities" (OuterVolumeSpecName: "utilities") pod "8e61b37d-27a6-44dc-83c2-1aa0b9850465" (UID: "8e61b37d-27a6-44dc-83c2-1aa0b9850465"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.196144 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz" (OuterVolumeSpecName: "kube-api-access-j5xmz") pod "8e61b37d-27a6-44dc-83c2-1aa0b9850465" (UID: "8e61b37d-27a6-44dc-83c2-1aa0b9850465"). InnerVolumeSpecName "kube-api-access-j5xmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.291982 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.292026 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5xmz\" (UniqueName: \"kubernetes.io/projected/8e61b37d-27a6-44dc-83c2-1aa0b9850465-kube-api-access-j5xmz\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.360566 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e61b37d-27a6-44dc-83c2-1aa0b9850465" (UID: "8e61b37d-27a6-44dc-83c2-1aa0b9850465"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.394084 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e61b37d-27a6-44dc-83c2-1aa0b9850465-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.629643 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n"] Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634543 4790 generic.go:334] "Generic (PLEG): container finished" podID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" exitCode=0 Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634604 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xck9d" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634624 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819"} Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634684 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xck9d" event={"ID":"8e61b37d-27a6-44dc-83c2-1aa0b9850465","Type":"ContainerDied","Data":"a7c04727a428df15d4ea2c9336e79e6ee57e874b67d1a38f59d2d7e8b47dd15c"} Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.634714 4790 scope.go:117] "RemoveContainer" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.659970 4790 scope.go:117] "RemoveContainer" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.674160 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.687627 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xck9d"] Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.690810 4790 scope.go:117] "RemoveContainer" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.707346 4790 scope.go:117] "RemoveContainer" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" Mar 13 20:53:02 crc kubenswrapper[4790]: E0313 20:53:02.708025 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819\": container with ID starting with c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819 not found: ID does not exist" containerID="c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708073 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819"} err="failed to get container status \"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819\": rpc error: code = NotFound desc = could not find container 
\"c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819\": container with ID starting with c6fbc7cd7ec379a21e08e3cecc73345b66f22f76742636abc22f3a0b62190819 not found: ID does not exist" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708105 4790 scope.go:117] "RemoveContainer" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" Mar 13 20:53:02 crc kubenswrapper[4790]: E0313 20:53:02.708670 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf\": container with ID starting with 555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf not found: ID does not exist" containerID="555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708723 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf"} err="failed to get container status \"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf\": rpc error: code = NotFound desc = could not find container \"555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf\": container with ID starting with 555cf133b8b9484a473d0bfbbd0ba97838cfa7dcd8770999e3cc05d719c43baf not found: ID does not exist" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.708754 4790 scope.go:117] "RemoveContainer" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" Mar 13 20:53:02 crc kubenswrapper[4790]: E0313 20:53:02.709087 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e\": container with ID starting with 9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e not found: ID does not exist" containerID="9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e" Mar 13 20:53:02 crc kubenswrapper[4790]: I0313 20:53:02.709127 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e"} err="failed to get container status \"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e\": rpc error: code = NotFound desc = could not find container \"9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e\": container with ID starting with 9b31f907153625e03b453ccf0cbfa00742e1266b409c07a40f62754b64b0c28e not found: ID does not exist" Mar 13 20:53:03 crc kubenswrapper[4790]: I0313 20:53:03.645978 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerStarted","Data":"ef1ad01ff1610150e75c805dfbe677ad94c23d1f578c4b9bb8893fd71bbdb07d"} Mar 13 20:53:03 crc kubenswrapper[4790]: I0313 20:53:03.646327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerStarted","Data":"8a8a4b31d38642270b5c6ca8e8476670fc95c963faff03b5219523182e59cc45"} Mar 13 20:53:03 crc kubenswrapper[4790]: I0313 20:53:03.681191 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" 
path="/var/lib/kubelet/pods/8e61b37d-27a6-44dc-83c2-1aa0b9850465/volumes" Mar 13 20:53:44 crc kubenswrapper[4790]: I0313 20:53:44.015528 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:53:44 crc kubenswrapper[4790]: I0313 20:53:44.016036 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:53:53 crc kubenswrapper[4790]: I0313 20:53:53.250604 4790 scope.go:117] "RemoveContainer" containerID="be394eadb1a12955ac79ebd44714ea2fd283def65154fb6c18e14cac83eb1a07" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.162001 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" podStartSLOduration=58.72791871 podStartE2EDuration="59.161983362s" podCreationTimestamp="2026-03-13 20:53:01 +0000 UTC" firstStartedPulling="2026-03-13 20:53:02.633822876 +0000 UTC m=+1513.654938767" lastFinishedPulling="2026-03-13 20:53:03.067887528 +0000 UTC m=+1514.089003419" observedRunningTime="2026-03-13 20:53:03.668169824 +0000 UTC m=+1514.689285735" watchObservedRunningTime="2026-03-13 20:54:00.161983362 +0000 UTC m=+1571.183099273" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.172839 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 20:54:00 crc kubenswrapper[4790]: E0313 20:54:00.173349 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-content" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173390 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-content" Mar 13 20:54:00 crc kubenswrapper[4790]: E0313 20:54:00.173412 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173421 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" Mar 13 20:54:00 crc kubenswrapper[4790]: E0313 20:54:00.173462 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-utilities" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173471 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="extract-utilities" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.173717 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e61b37d-27a6-44dc-83c2-1aa0b9850465" containerName="registry-server" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.174505 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.180207 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.180708 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.181608 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.186892 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.196269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"auto-csr-approver-29557254-62lxw\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.297103 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"auto-csr-approver-29557254-62lxw\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.316820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"auto-csr-approver-29557254-62lxw\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.494272 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.953659 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 20:54:00 crc kubenswrapper[4790]: I0313 20:54:00.954214 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:54:01 crc kubenswrapper[4790]: I0313 20:54:01.149562 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557254-62lxw" event={"ID":"173eb1b0-728a-4420-bfab-ba33ae08f5eb","Type":"ContainerStarted","Data":"95504a8a944ca2a2025a00a28745f457433239f1a60e421ea50cc57ac0d77836"} Mar 13 20:54:03 crc kubenswrapper[4790]: I0313 20:54:03.181239 4790 generic.go:334] "Generic (PLEG): container finished" podID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerID="1354228427a90e6609d9b0170fc1b61342fc6ff24449709c9abd0f642ea90a66" exitCode=0 Mar 13 20:54:03 crc kubenswrapper[4790]: I0313 20:54:03.181311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557254-62lxw" event={"ID":"173eb1b0-728a-4420-bfab-ba33ae08f5eb","Type":"ContainerDied","Data":"1354228427a90e6609d9b0170fc1b61342fc6ff24449709c9abd0f642ea90a66"} Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.483176 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.679197 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") pod \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\" (UID: \"173eb1b0-728a-4420-bfab-ba33ae08f5eb\") " Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.687220 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s" (OuterVolumeSpecName: "kube-api-access-zwk9s") pod "173eb1b0-728a-4420-bfab-ba33ae08f5eb" (UID: "173eb1b0-728a-4420-bfab-ba33ae08f5eb"). InnerVolumeSpecName "kube-api-access-zwk9s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:54:04 crc kubenswrapper[4790]: I0313 20:54:04.782411 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zwk9s\" (UniqueName: \"kubernetes.io/projected/173eb1b0-728a-4420-bfab-ba33ae08f5eb-kube-api-access-zwk9s\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.201253 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557254-62lxw" event={"ID":"173eb1b0-728a-4420-bfab-ba33ae08f5eb","Type":"ContainerDied","Data":"95504a8a944ca2a2025a00a28745f457433239f1a60e421ea50cc57ac0d77836"} Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.201621 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95504a8a944ca2a2025a00a28745f457433239f1a60e421ea50cc57ac0d77836" Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.201344 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557254-62lxw" Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.548977 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.564152 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557248-msw96"] Mar 13 20:54:05 crc kubenswrapper[4790]: I0313 20:54:05.669772 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eda4da8c-f54a-4c25-9669-ff180aa0b9a9" path="/var/lib/kubelet/pods/eda4da8c-f54a-4c25-9669-ff180aa0b9a9/volumes" Mar 13 20:54:14 crc kubenswrapper[4790]: I0313 20:54:14.016131 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:54:14 crc kubenswrapper[4790]: I0313 20:54:14.017548 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.929500 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:36 crc kubenswrapper[4790]: E0313 20:54:36.930444 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerName="oc" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.930457 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerName="oc" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.930688 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" containerName="oc" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.932018 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:36 crc kubenswrapper[4790]: I0313 20:54:36.945303 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.108584 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.108651 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.108712 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210116 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210181 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210235 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210770 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.210985 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.234159 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"redhat-marketplace-lczcj\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.252556 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:37 crc kubenswrapper[4790]: I0313 20:54:37.724827 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:38 crc kubenswrapper[4790]: I0313 20:54:38.491071 4790 generic.go:334] "Generic (PLEG): container finished" podID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" exitCode=0 Mar 13 20:54:38 crc kubenswrapper[4790]: I0313 20:54:38.491192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50"} Mar 13 20:54:38 crc kubenswrapper[4790]: I0313 20:54:38.491471 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerStarted","Data":"865528cf8a33a2814a57e2c3535b244c3a265fa0760972e7defe25f2fc5fe2d7"} Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.015589 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.016175 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.016229 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.017134 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.017200 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" gracePeriod=600 Mar 13 20:54:44 crc kubenswrapper[4790]: E0313 20:54:44.135978 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.555802 4790 generic.go:334] "Generic (PLEG): container finished" podID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" exitCode=0 Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.556186 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df"} Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.561236 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" exitCode=0 Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.561308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e"} Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.561460 4790 scope.go:117] "RemoveContainer" containerID="7265c148a5840e02c0d05363d253e3b056f233c63bc78d73aa4fcf9dbde019eb" Mar 13 20:54:44 crc kubenswrapper[4790]: I0313 20:54:44.562720 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:54:44 crc kubenswrapper[4790]: E0313 20:54:44.563543 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:54:45 crc kubenswrapper[4790]: I0313 20:54:45.573248 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerStarted","Data":"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8"} Mar 13 20:54:45 crc kubenswrapper[4790]: I0313 20:54:45.595697 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lczcj" podStartSLOduration=3.012902525 podStartE2EDuration="9.595678316s" podCreationTimestamp="2026-03-13 20:54:36 +0000 UTC" firstStartedPulling="2026-03-13 20:54:38.493065916 +0000 UTC m=+1609.514181807" lastFinishedPulling="2026-03-13 20:54:45.075841707 +0000 UTC m=+1616.096957598" observedRunningTime="2026-03-13 20:54:45.591553112 +0000 UTC m=+1616.612669013" watchObservedRunningTime="2026-03-13 20:54:45.595678316 +0000 UTC m=+1616.616794207" Mar 13 20:54:47 crc kubenswrapper[4790]: I0313 20:54:47.252683 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:47 crc kubenswrapper[4790]: I0313 20:54:47.253034 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:47 crc kubenswrapper[4790]: I0313 20:54:47.300985 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:53 crc kubenswrapper[4790]: I0313 20:54:53.351712 4790 scope.go:117] "RemoveContainer" containerID="3e3742b7258e70b94cf2ef846ea4b59ba8175c78c72478006fdab7b609eebe2a" Mar 13 20:54:55 crc kubenswrapper[4790]: I0313 20:54:55.659438 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:54:55 crc kubenswrapper[4790]: E0313 20:54:55.659897 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:54:57 crc kubenswrapper[4790]: I0313 20:54:57.309489 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:57 crc kubenswrapper[4790]: I0313 20:54:57.366885 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:57 crc kubenswrapper[4790]: I0313 20:54:57.689801 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lczcj" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" containerID="cri-o://6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" gracePeriod=2 Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.143907 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.268258 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") pod \"58e8b831-38b3-41f5-b0db-341376a43ee7\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.268369 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") pod \"58e8b831-38b3-41f5-b0db-341376a43ee7\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.268574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") pod \"58e8b831-38b3-41f5-b0db-341376a43ee7\" (UID: \"58e8b831-38b3-41f5-b0db-341376a43ee7\") " Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.269368 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities" (OuterVolumeSpecName: "utilities") pod "58e8b831-38b3-41f5-b0db-341376a43ee7" (UID: "58e8b831-38b3-41f5-b0db-341376a43ee7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.273912 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7" (OuterVolumeSpecName: "kube-api-access-59rk7") pod "58e8b831-38b3-41f5-b0db-341376a43ee7" (UID: "58e8b831-38b3-41f5-b0db-341376a43ee7"). InnerVolumeSpecName "kube-api-access-59rk7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.296754 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "58e8b831-38b3-41f5-b0db-341376a43ee7" (UID: "58e8b831-38b3-41f5-b0db-341376a43ee7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.370418 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.370459 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59rk7\" (UniqueName: \"kubernetes.io/projected/58e8b831-38b3-41f5-b0db-341376a43ee7-kube-api-access-59rk7\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.370470 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/58e8b831-38b3-41f5-b0db-341376a43ee7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.698955 4790 generic.go:334] "Generic (PLEG): container finished" podID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" exitCode=0 Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699000 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lczcj" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8"} Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699373 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lczcj" event={"ID":"58e8b831-38b3-41f5-b0db-341376a43ee7","Type":"ContainerDied","Data":"865528cf8a33a2814a57e2c3535b244c3a265fa0760972e7defe25f2fc5fe2d7"} Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.699404 4790 scope.go:117] "RemoveContainer" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.719712 4790 scope.go:117] "RemoveContainer" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.735855 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.744799 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lczcj"] Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.760585 4790 scope.go:117] "RemoveContainer" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.788019 4790 scope.go:117] "RemoveContainer" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" Mar 13 20:54:58 crc kubenswrapper[4790]: E0313 20:54:58.788549 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8\": container with ID starting with 6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8 not found: ID does not exist" containerID="6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.788596 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8"} err="failed to get container status \"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8\": rpc error: code = NotFound desc = could not find container \"6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8\": container with ID starting with 6202db31013bdb2d3ab746d570252cb882bb2629c21068d8f854492b5271bae8 not found: ID does not exist" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.788618 4790 scope.go:117] "RemoveContainer" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" Mar 13 20:54:58 crc kubenswrapper[4790]: E0313 20:54:58.789088 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df\": container with ID starting with 3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df not found: ID does not exist" containerID="3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.789115 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df"} err="failed to get container status \"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df\": rpc error: code = NotFound desc = could not find container \"3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df\": container with ID starting with 3b1aa8217bcea5c47403ae5a3cf749fbe15e7addfd050a1bd5ca97417c3867df not found: ID does not exist" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.789132 4790 scope.go:117] "RemoveContainer" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" Mar 13 20:54:58 crc kubenswrapper[4790]: E0313 20:54:58.795078 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50\": container with ID starting with fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50 not found: ID does not exist" containerID="fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50" Mar 13 20:54:58 crc kubenswrapper[4790]: I0313 20:54:58.795135 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50"} err="failed to get container status \"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50\": rpc error: code = NotFound desc = could not find container \"fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50\": container with ID starting with fffa56def434067a6e0068caab2a2962945497b9d0362445f755d014aa917e50 not found: ID does not exist" Mar 13 20:54:59 crc kubenswrapper[4790]: I0313 20:54:59.673442 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" path="/var/lib/kubelet/pods/58e8b831-38b3-41f5-b0db-341376a43ee7/volumes" Mar 13 20:55:07 crc kubenswrapper[4790]: I0313 20:55:07.659655 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:07 crc kubenswrapper[4790]: E0313 20:55:07.660333 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.179767 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:09 crc kubenswrapper[4790]: E0313 20:55:09.180174 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-content" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180187 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-content" Mar 13 20:55:09 crc kubenswrapper[4790]: E0313 20:55:09.180201 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180209 4790 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" Mar 13 20:55:09 crc kubenswrapper[4790]: E0313 20:55:09.180230 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-utilities" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180239 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="extract-utilities" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.180468 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e8b831-38b3-41f5-b0db-341376a43ee7" containerName="registry-server" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.181760 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.194576 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.294866 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.294927 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.294981 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397020 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397206 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.397259 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 
20:55:09.397800 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.398071 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.420677 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"certified-operators-sfmsk\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:09 crc kubenswrapper[4790]: I0313 20:55:09.500901 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.020711 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.848319 4790 generic.go:334] "Generic (PLEG): container finished" podID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" exitCode=0 Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.848423 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f"} Mar 13 20:55:10 crc kubenswrapper[4790]: I0313 20:55:10.848657 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerStarted","Data":"f2d2a481f1c0ef44a5d063d8a491e0780cdc0c10daa3fa086aeadd318fcf2d52"} Mar 13 20:55:11 crc kubenswrapper[4790]: I0313 20:55:11.860062 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerStarted","Data":"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553"} Mar 13 20:55:12 crc kubenswrapper[4790]: I0313 20:55:12.871864 4790 generic.go:334] "Generic (PLEG): container finished" podID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" exitCode=0 Mar 13 20:55:12 crc kubenswrapper[4790]: I0313 20:55:12.871936 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553"} Mar 13 20:55:13 crc kubenswrapper[4790]: I0313 20:55:13.881838 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" 
event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerStarted","Data":"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109"} Mar 13 20:55:13 crc kubenswrapper[4790]: I0313 20:55:13.902454 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sfmsk" podStartSLOduration=2.425414831 podStartE2EDuration="4.902435033s" podCreationTimestamp="2026-03-13 20:55:09 +0000 UTC" firstStartedPulling="2026-03-13 20:55:10.850659753 +0000 UTC m=+1641.871775654" lastFinishedPulling="2026-03-13 20:55:13.327679965 +0000 UTC m=+1644.348795856" observedRunningTime="2026-03-13 20:55:13.899338338 +0000 UTC m=+1644.920454229" watchObservedRunningTime="2026-03-13 20:55:13.902435033 +0000 UTC m=+1644.923550934" Mar 13 20:55:19 crc kubenswrapper[4790]: I0313 20:55:19.501753 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:19 crc kubenswrapper[4790]: I0313 20:55:19.502004 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:19 crc kubenswrapper[4790]: I0313 20:55:19.547422 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:20 crc kubenswrapper[4790]: I0313 20:55:20.006310 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:20 crc kubenswrapper[4790]: I0313 20:55:20.051054 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:21 crc kubenswrapper[4790]: I0313 20:55:21.660916 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:21 crc kubenswrapper[4790]: E0313 20:55:21.661680 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:21 crc kubenswrapper[4790]: I0313 20:55:21.959423 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sfmsk" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" containerID="cri-o://3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" gracePeriod=2 Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.403107 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.591267 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") pod \"13b049e8-3316-420e-9ec2-a83f7c645d0d\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.591430 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") pod \"13b049e8-3316-420e-9ec2-a83f7c645d0d\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.592192 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") pod \"13b049e8-3316-420e-9ec2-a83f7c645d0d\" (UID: \"13b049e8-3316-420e-9ec2-a83f7c645d0d\") " Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.593206 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities" (OuterVolumeSpecName: "utilities") pod "13b049e8-3316-420e-9ec2-a83f7c645d0d" (UID: "13b049e8-3316-420e-9ec2-a83f7c645d0d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.597148 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8" (OuterVolumeSpecName: "kube-api-access-s4bp8") pod "13b049e8-3316-420e-9ec2-a83f7c645d0d" (UID: "13b049e8-3316-420e-9ec2-a83f7c645d0d"). InnerVolumeSpecName "kube-api-access-s4bp8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.694945 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.694995 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4bp8\" (UniqueName: \"kubernetes.io/projected/13b049e8-3316-420e-9ec2-a83f7c645d0d-kube-api-access-s4bp8\") on node \"crc\" DevicePath \"\"" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.972863 4790 generic.go:334] "Generic (PLEG): container finished" podID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" exitCode=0 Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.972911 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sfmsk" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.972930 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109"} Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.973317 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sfmsk" event={"ID":"13b049e8-3316-420e-9ec2-a83f7c645d0d","Type":"ContainerDied","Data":"f2d2a481f1c0ef44a5d063d8a491e0780cdc0c10daa3fa086aeadd318fcf2d52"} Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.973336 4790 scope.go:117] "RemoveContainer" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" Mar 13 20:55:22 crc kubenswrapper[4790]: I0313 20:55:22.993682 4790 scope.go:117] "RemoveContainer" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.011838 4790 scope.go:117] "RemoveContainer" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.052977 4790 scope.go:117] "RemoveContainer" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" Mar 13 20:55:23 crc kubenswrapper[4790]: E0313 20:55:23.053353 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109\": container with ID starting with 3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109 not found: ID does not exist" containerID="3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053421 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109"} err="failed to get container status \"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109\": rpc error: code = NotFound desc = could not find container \"3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109\": container with ID starting with 3204874eb62d0bb67071495dd8ff95e6ab0f1a99641a1b730e74fd60b4375109 not found: ID does not exist" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053456 4790 scope.go:117] "RemoveContainer" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" Mar 13 20:55:23 crc kubenswrapper[4790]: E0313 20:55:23.053764 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553\": container with ID starting with 0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553 not found: ID does not exist" containerID="0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053794 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553"} err="failed to get container status \"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553\": rpc error: code = NotFound desc = could not find container 
\"0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553\": container with ID starting with 0d68cc81b976304ce2e98140c9518e5210f646e32b7b98fc74db223c11322553 not found: ID does not exist" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.053812 4790 scope.go:117] "RemoveContainer" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" Mar 13 20:55:23 crc kubenswrapper[4790]: E0313 20:55:23.054045 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f\": container with ID starting with c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f not found: ID does not exist" containerID="c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.054080 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f"} err="failed to get container status \"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f\": rpc error: code = NotFound desc = could not find container \"c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f\": container with ID starting with c3d9be8124d1105bee607ecc0ad1a5c568aca6c2f3e5185431d764b53b6d3a7f not found: ID does not exist" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.756195 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13b049e8-3316-420e-9ec2-a83f7c645d0d" (UID: "13b049e8-3316-420e-9ec2-a83f7c645d0d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.813123 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13b049e8-3316-420e-9ec2-a83f7c645d0d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.912705 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:23 crc kubenswrapper[4790]: I0313 20:55:23.923395 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sfmsk"] Mar 13 20:55:25 crc kubenswrapper[4790]: I0313 20:55:25.670597 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" path="/var/lib/kubelet/pods/13b049e8-3316-420e-9ec2-a83f7c645d0d/volumes" Mar 13 20:55:32 crc kubenswrapper[4790]: I0313 20:55:32.665629 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:32 crc kubenswrapper[4790]: E0313 20:55:32.666765 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:43 crc kubenswrapper[4790]: I0313 20:55:43.661659 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:43 crc kubenswrapper[4790]: E0313 20:55:43.662334 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.028753 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:55:49 crc kubenswrapper[4790]: E0313 20:55:49.029755 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-content" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.029772 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-content" Mar 13 20:55:49 crc kubenswrapper[4790]: E0313 20:55:49.029794 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-utilities" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.029803 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="extract-utilities" Mar 13 20:55:49 crc kubenswrapper[4790]: E0313 20:55:49.029838 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.029845 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.030055 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="13b049e8-3316-420e-9ec2-a83f7c645d0d" containerName="registry-server" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.031550 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.051384 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.100877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.100957 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.100983 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203184 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203217 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203792 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.203825 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.226308 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"community-operators-vnc4k\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.347342 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:49 crc kubenswrapper[4790]: I0313 20:55:49.931439 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:55:50 crc kubenswrapper[4790]: I0313 20:55:50.230039 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" exitCode=0 Mar 13 20:55:50 crc kubenswrapper[4790]: I0313 20:55:50.230093 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9"} Mar 13 20:55:50 crc kubenswrapper[4790]: I0313 20:55:50.230121 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerStarted","Data":"3fa0411158a515661fc93f23367f71b3fb55a578ca344043d47fb79a8a6b6cd1"} Mar 13 20:55:51 crc kubenswrapper[4790]: I0313 20:55:51.242411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerStarted","Data":"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d"} Mar 13 20:55:52 crc kubenswrapper[4790]: I0313 20:55:52.253734 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" exitCode=0 Mar 13 20:55:52 crc kubenswrapper[4790]: I0313 20:55:52.253779 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d"} Mar 13 20:55:53 crc kubenswrapper[4790]: I0313 20:55:53.264764 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerStarted","Data":"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7"} Mar 13 20:55:53 crc kubenswrapper[4790]: I0313 20:55:53.285664 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vnc4k" podStartSLOduration=1.810891243 podStartE2EDuration="4.285647563s" podCreationTimestamp="2026-03-13 20:55:49 +0000 UTC" firstStartedPulling="2026-03-13 
20:55:50.232608568 +0000 UTC m=+1681.253724459" lastFinishedPulling="2026-03-13 20:55:52.707364888 +0000 UTC m=+1683.728480779" observedRunningTime="2026-03-13 20:55:53.280990285 +0000 UTC m=+1684.302106176" watchObservedRunningTime="2026-03-13 20:55:53.285647563 +0000 UTC m=+1684.306763454" Mar 13 20:55:55 crc kubenswrapper[4790]: I0313 20:55:55.660300 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:55:55 crc kubenswrapper[4790]: E0313 20:55:55.660871 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:55:59 crc kubenswrapper[4790]: I0313 20:55:59.348305 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:59 crc kubenswrapper[4790]: I0313 20:55:59.349963 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:55:59 crc kubenswrapper[4790]: I0313 20:55:59.400913 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.148694 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.149869 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.159010 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.159038 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.159438 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.162037 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.224311 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"auto-csr-approver-29557256-v26h5\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.327482 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"auto-csr-approver-29557256-v26h5\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.348475 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"auto-csr-approver-29557256-v26h5\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.389509 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.454757 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.468054 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:00 crc kubenswrapper[4790]: I0313 20:56:00.957208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 20:56:00 crc kubenswrapper[4790]: W0313 20:56:00.958948 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ffc58ad_c12d_4165_bc92_1e948aa14c42.slice/crio-11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4 WatchSource:0}: Error finding container 11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4: Status 404 returned error can't find the container with id 11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4 Mar 13 20:56:01 crc kubenswrapper[4790]: I0313 20:56:01.334910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerStarted","Data":"11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4"} Mar 13 20:56:02 crc kubenswrapper[4790]: I0313 20:56:02.345818 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerStarted","Data":"089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093"} Mar 13 20:56:02 crc kubenswrapper[4790]: I0313 20:56:02.346165 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vnc4k" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" containerID="cri-o://5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" gracePeriod=2 Mar 13 20:56:02 crc kubenswrapper[4790]: I0313 20:56:02.365327 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557256-v26h5" podStartSLOduration=1.549286537 podStartE2EDuration="2.365307644s" podCreationTimestamp="2026-03-13 20:56:00 +0000 UTC" firstStartedPulling="2026-03-13 20:56:00.961066156 +0000 UTC m=+1691.982182047" lastFinishedPulling="2026-03-13 20:56:01.777087263 +0000 UTC m=+1692.798203154" observedRunningTime="2026-03-13 20:56:02.360909283 +0000 UTC m=+1693.382025194" watchObservedRunningTime="2026-03-13 20:56:02.365307644 +0000 UTC m=+1693.386423535" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.291330 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361634 4790 generic.go:334] "Generic (PLEG): container finished" podID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" exitCode=0 Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361690 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vnc4k" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361721 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7"} Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361754 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vnc4k" event={"ID":"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca","Type":"ContainerDied","Data":"3fa0411158a515661fc93f23367f71b3fb55a578ca344043d47fb79a8a6b6cd1"} Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.361772 4790 scope.go:117] "RemoveContainer" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.363592 4790 generic.go:334] "Generic (PLEG): container finished" podID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerID="089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093" exitCode=0 Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.363638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerDied","Data":"089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093"} Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.387931 4790 scope.go:117] "RemoveContainer" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.390444 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") pod \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.390504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") pod \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.390548 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") pod \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\" (UID: \"8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca\") " Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.391698 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities" (OuterVolumeSpecName: "utilities") pod "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" (UID: "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.404918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb" (OuterVolumeSpecName: "kube-api-access-mzxcb") pod "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" (UID: "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca"). 
InnerVolumeSpecName "kube-api-access-mzxcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.409208 4790 scope.go:117] "RemoveContainer" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.462295 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" (UID: "8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.492931 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.493190 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzxcb\" (UniqueName: \"kubernetes.io/projected/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-kube-api-access-mzxcb\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.493278 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.494376 4790 scope.go:117] "RemoveContainer" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" Mar 13 20:56:03 crc kubenswrapper[4790]: E0313 20:56:03.495015 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7\": container with ID starting with 5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7 not found: ID does not exist" containerID="5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495061 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7"} err="failed to get container status \"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7\": rpc error: code = NotFound desc = could not find container \"5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7\": container with ID starting with 5f175ab37bfbe4f9c0dfee7f45b64e0fb111149fb1e36facd0869fb7a35a3ce7 not found: ID does not exist" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495086 4790 scope.go:117] "RemoveContainer" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" Mar 13 20:56:03 crc kubenswrapper[4790]: E0313 20:56:03.495554 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d\": container with ID starting with f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d not found: ID does not exist" containerID="f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495618 4790 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d"} err="failed to get container status \"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d\": rpc error: code = NotFound desc = could not find container \"f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d\": container with ID starting with f4e6f375194c9d82edee3c8c0e5a0033a58da92d8a71aefbe41d5716b314e80d not found: ID does not exist" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.495666 4790 scope.go:117] "RemoveContainer" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" Mar 13 20:56:03 crc kubenswrapper[4790]: E0313 20:56:03.495968 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9\": container with ID starting with a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9 not found: ID does not exist" containerID="a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.496063 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9"} err="failed to get container status \"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9\": rpc error: code = NotFound desc = could not find container \"a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9\": container with ID starting with a222f6aee417963dd7b4246a1062f198df423f2dc646db76b7b1c2d5761570d9 not found: ID does not exist" Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.710424 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:56:03 crc kubenswrapper[4790]: I0313 20:56:03.718503 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vnc4k"] Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.706552 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.828204 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") pod \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\" (UID: \"4ffc58ad-c12d-4165-bc92-1e948aa14c42\") " Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.832339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6" (OuterVolumeSpecName: "kube-api-access-5nrf6") pod "4ffc58ad-c12d-4165-bc92-1e948aa14c42" (UID: "4ffc58ad-c12d-4165-bc92-1e948aa14c42"). InnerVolumeSpecName "kube-api-access-5nrf6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:56:04 crc kubenswrapper[4790]: I0313 20:56:04.930415 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nrf6\" (UniqueName: \"kubernetes.io/projected/4ffc58ad-c12d-4165-bc92-1e948aa14c42-kube-api-access-5nrf6\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.386432 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557256-v26h5" event={"ID":"4ffc58ad-c12d-4165-bc92-1e948aa14c42","Type":"ContainerDied","Data":"11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4"} Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.386477 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11e62a2b63d556f601295cc99baf4b5ec1decd822d940365ab70f3e1697b0ce4" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.386688 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557256-v26h5" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.437436 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.446918 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557250-wqt56"] Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.672922 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" path="/var/lib/kubelet/pods/8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca/volumes" Mar 13 20:56:05 crc kubenswrapper[4790]: I0313 20:56:05.674082 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00a5fd8-e634-4969-90ad-6850179e7de1" path="/var/lib/kubelet/pods/d00a5fd8-e634-4969-90ad-6850179e7de1/volumes" Mar 13 20:56:07 crc kubenswrapper[4790]: I0313 20:56:07.407594 4790 generic.go:334] "Generic (PLEG): container finished" podID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerID="ef1ad01ff1610150e75c805dfbe677ad94c23d1f578c4b9bb8893fd71bbdb07d" exitCode=0 Mar 13 20:56:07 crc kubenswrapper[4790]: I0313 20:56:07.407681 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerDied","Data":"ef1ad01ff1610150e75c805dfbe677ad94c23d1f578c4b9bb8893fd71bbdb07d"} Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.660715 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:08 crc kubenswrapper[4790]: E0313 20:56:08.661314 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.811141 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913646 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913735 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.913835 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") pod \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\" (UID: \"5fc3181b-a2df-4d5c-afa1-057cef46dd95\") " Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.920357 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.920745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4" (OuterVolumeSpecName: "kube-api-access-ctzx4") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "kube-api-access-ctzx4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.942467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory" (OuterVolumeSpecName: "inventory") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:56:08 crc kubenswrapper[4790]: I0313 20:56:08.951896 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5fc3181b-a2df-4d5c-afa1-057cef46dd95" (UID: "5fc3181b-a2df-4d5c-afa1-057cef46dd95"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016226 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctzx4\" (UniqueName: \"kubernetes.io/projected/5fc3181b-a2df-4d5c-afa1-057cef46dd95-kube-api-access-ctzx4\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016261 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016271 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.016281 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5fc3181b-a2df-4d5c-afa1-057cef46dd95-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.428246 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" event={"ID":"5fc3181b-a2df-4d5c-afa1-057cef46dd95","Type":"ContainerDied","Data":"8a8a4b31d38642270b5c6ca8e8476670fc95c963faff03b5219523182e59cc45"} Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.428285 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.428292 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a8a4b31d38642270b5c6ca8e8476670fc95c963faff03b5219523182e59cc45" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.501714 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58"] Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.502746 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.502872 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.502961 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-utilities" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503033 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-utilities" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.503104 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerName="oc" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503184 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerName="oc" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.503281 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" 
containerName="extract-content" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503365 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="extract-content" Mar 13 20:56:09 crc kubenswrapper[4790]: E0313 20:56:09.503473 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503544 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503846 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" containerName="oc" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.503949 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f5c1f9b-56fb-46e2-99c0-b79ab684e1ca" containerName="registry-server" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.504053 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fc3181b-a2df-4d5c-afa1-057cef46dd95" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.504942 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.506650 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.506844 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.507325 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.507706 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.513768 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58"] Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.629877 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.630123 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.630175 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57sh2\" (UniqueName: 
\"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.731703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.731751 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.731925 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.745246 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.745349 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.750586 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-cfp58\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:09 crc kubenswrapper[4790]: I0313 20:56:09.827613 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:56:10 crc kubenswrapper[4790]: I0313 20:56:10.201759 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58"] Mar 13 20:56:10 crc kubenswrapper[4790]: W0313 20:56:10.205226 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod304addb4_f579_42f8_87d8_8e15b713aef2.slice/crio-18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf WatchSource:0}: Error finding container 18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf: Status 404 returned error can't find the container with id 18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf Mar 13 20:56:10 crc kubenswrapper[4790]: I0313 20:56:10.438919 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerStarted","Data":"18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf"} Mar 13 20:56:11 crc kubenswrapper[4790]: I0313 20:56:11.465791 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerStarted","Data":"c52e39aa381aa5d2fab1bd41bd98b85078dce93efb6dfc416500534a84998765"} Mar 13 20:56:11 crc kubenswrapper[4790]: I0313 20:56:11.486070 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" podStartSLOduration=2.001567418 podStartE2EDuration="2.486053856s" podCreationTimestamp="2026-03-13 20:56:09 +0000 UTC" firstStartedPulling="2026-03-13 20:56:10.207686019 +0000 UTC m=+1701.228801910" lastFinishedPulling="2026-03-13 20:56:10.692172447 +0000 UTC m=+1701.713288348" observedRunningTime="2026-03-13 20:56:11.484284897 +0000 UTC m=+1702.505400788" watchObservedRunningTime="2026-03-13 20:56:11.486053856 +0000 UTC m=+1702.507169747" Mar 13 20:56:21 crc kubenswrapper[4790]: I0313 20:56:21.659864 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:21 crc kubenswrapper[4790]: E0313 20:56:21.660576 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:32 crc kubenswrapper[4790]: I0313 20:56:32.661114 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:32 crc kubenswrapper[4790]: E0313 20:56:32.661971 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:33 crc 
kubenswrapper[4790]: I0313 20:56:33.037551 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.065964 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.074497 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-bc9a-account-create-update-7s4hb"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.082740 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-qflsz"] Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.668729 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b5f7e2a-401c-4a9f-9222-5037f9d1d499" path="/var/lib/kubelet/pods/1b5f7e2a-401c-4a9f-9222-5037f9d1d499/volumes" Mar 13 20:56:33 crc kubenswrapper[4790]: I0313 20:56:33.669269 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bfc00cf-9a76-4b6f-a8f5-315af824814d" path="/var/lib/kubelet/pods/9bfc00cf-9a76-4b6f-a8f5-315af824814d/volumes" Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.029762 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.039303 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.050479 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.059118 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.068075 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-6245-account-create-update-5tjxd"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.076686 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-swgpr"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.085763 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-dtps4"] Mar 13 20:56:34 crc kubenswrapper[4790]: I0313 20:56:34.095052 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-76eb-account-create-update-fsrb9"] Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.669127 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c" path="/var/lib/kubelet/pods/0b6f7fe9-fb1f-430c-80e5-0dbe98da2b9c/volumes" Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.669674 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a3a1cb-c500-4355-ae67-649e381b1b88" path="/var/lib/kubelet/pods/20a3a1cb-c500-4355-ae67-649e381b1b88/volumes" Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.670287 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a7e4224-0922-4f9a-af94-0a9933f27530" path="/var/lib/kubelet/pods/2a7e4224-0922-4f9a-af94-0a9933f27530/volumes" Mar 13 20:56:35 crc kubenswrapper[4790]: I0313 20:56:35.670918 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0" path="/var/lib/kubelet/pods/a0a6f76f-d9d1-4ab9-ac4c-e483e55926a0/volumes" Mar 13 20:56:43 crc kubenswrapper[4790]: I0313 
20:56:43.660233 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:43 crc kubenswrapper[4790]: E0313 20:56:43.661082 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:56:49 crc kubenswrapper[4790]: I0313 20:56:49.042610 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:56:49 crc kubenswrapper[4790]: I0313 20:56:49.052717 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fjjbp"] Mar 13 20:56:49 crc kubenswrapper[4790]: I0313 20:56:49.675535 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfa975ed-d42b-43be-91a1-4a2288005883" path="/var/lib/kubelet/pods/cfa975ed-d42b-43be-91a1-4a2288005883/volumes" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.461092 4790 scope.go:117] "RemoveContainer" containerID="5bf52c9a0edc80ae6550c060c79e70ad8f311cf1880d5319a92eed662b3ae498" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.482956 4790 scope.go:117] "RemoveContainer" containerID="684aff5511e6e0a081533906daec355673be31064917c7fdefb18571783852b8" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.522558 4790 scope.go:117] "RemoveContainer" containerID="caad9d6f6144a7c4b4a17b2bfe51bfa98c2031dcffb22ecbae67c200ab59beba" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.546040 4790 scope.go:117] "RemoveContainer" containerID="e50d6c82675c18c36b9041dc6a13dffb21bb7a9c1cb73ee61c06ce0d61f0b9b3" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.587813 4790 scope.go:117] "RemoveContainer" containerID="eaedee9332ceb5ac2c43fa820fcea3e6086d5dfda3317381786c3cc819576b44" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.639807 4790 scope.go:117] "RemoveContainer" containerID="2d9bc31a36f8979f03c449ef60b47d579e8e6f07093cd0f2e81bc56503b15368" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.691188 4790 scope.go:117] "RemoveContainer" containerID="3a871452c0d8f0bdf8b93e4dc697c5c69984d2e498b186e2955afdc399d1238f" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.716629 4790 scope.go:117] "RemoveContainer" containerID="a46a82afe76ba100b2ac912d7fb0a03ce75de0a957f3543d9259571fea13e90c" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.755719 4790 scope.go:117] "RemoveContainer" containerID="4ebc3b3d24c09595199a728e2bcc2be34cd5ab68545cd7072d9ba0e08a6b3dd5" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.780426 4790 scope.go:117] "RemoveContainer" containerID="32a172eb03df58396e7932c7253bd1c867efa547f28c6e6ae81472b2dd89ad69" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.799623 4790 scope.go:117] "RemoveContainer" containerID="7987e2881155643a03157d6664eb40102409516e2d6983b6bde0009190d2b009" Mar 13 20:56:53 crc kubenswrapper[4790]: I0313 20:56:53.821407 4790 scope.go:117] "RemoveContainer" containerID="ff9f56b80e2e388086557f7fc707002adf2609bc96cff97367abf262894bf61f" Mar 13 20:56:58 crc kubenswrapper[4790]: I0313 20:56:58.660242 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:56:58 
crc kubenswrapper[4790]: E0313 20:56:58.660974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:01 crc kubenswrapper[4790]: I0313 20:57:01.032569 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:57:01 crc kubenswrapper[4790]: I0313 20:57:01.042013 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-pshzp"] Mar 13 20:57:01 crc kubenswrapper[4790]: I0313 20:57:01.672179 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a93720f0-c882-49d8-bd56-7d77237da6e7" path="/var/lib/kubelet/pods/a93720f0-c882-49d8-bd56-7d77237da6e7/volumes" Mar 13 20:57:11 crc kubenswrapper[4790]: I0313 20:57:11.660319 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:11 crc kubenswrapper[4790]: E0313 20:57:11.661159 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:15 crc kubenswrapper[4790]: I0313 20:57:15.050490 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:57:15 crc kubenswrapper[4790]: I0313 20:57:15.071303 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3bc0-account-create-update-ntn27"] Mar 13 20:57:15 crc kubenswrapper[4790]: I0313 20:57:15.671932 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cfc09f48-1b0c-45fe-be9b-8bf3a3af887c" path="/var/lib/kubelet/pods/cfc09f48-1b0c-45fe-be9b-8bf3a3af887c/volumes" Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.028668 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.042645 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-f926w"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.050167 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.060785 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.080552 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.080619 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.090513 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4p54c"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.100373 4790 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/neutron-4d80-account-create-update-7trkt"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.108251 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-eae0-account-create-update-ljhjl"] Mar 13 20:57:16 crc kubenswrapper[4790]: I0313 20:57:16.115205 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-56s96"] Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.670299 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dd76b06-ea34-4044-bba0-cf5e6e822b6b" path="/var/lib/kubelet/pods/1dd76b06-ea34-4044-bba0-cf5e6e822b6b/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.671124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e11abfd-7d59-479b-9f77-cbbd22cbf48c" path="/var/lib/kubelet/pods/8e11abfd-7d59-479b-9f77-cbbd22cbf48c/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.671647 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e551be1a-728e-4851-894c-30b4493326d6" path="/var/lib/kubelet/pods/e551be1a-728e-4851-894c-30b4493326d6/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.672180 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7d496eb-3f17-4e7b-9a68-c91dec27355a" path="/var/lib/kubelet/pods/e7d496eb-3f17-4e7b-9a68-c91dec27355a/volumes" Mar 13 20:57:17 crc kubenswrapper[4790]: I0313 20:57:17.673165 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc51014c-323e-4a6b-9202-edc7b135809d" path="/var/lib/kubelet/pods/fc51014c-323e-4a6b-9202-edc7b135809d/volumes" Mar 13 20:57:20 crc kubenswrapper[4790]: I0313 20:57:20.030248 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:57:20 crc kubenswrapper[4790]: I0313 20:57:20.040871 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-jf9fb"] Mar 13 20:57:21 crc kubenswrapper[4790]: I0313 20:57:21.670812 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4214f238-4044-45ab-8e40-48894500f25f" path="/var/lib/kubelet/pods/4214f238-4044-45ab-8e40-48894500f25f/volumes" Mar 13 20:57:26 crc kubenswrapper[4790]: I0313 20:57:26.660173 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:26 crc kubenswrapper[4790]: E0313 20:57:26.661033 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:37 crc kubenswrapper[4790]: I0313 20:57:37.661039 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:37 crc kubenswrapper[4790]: E0313 20:57:37.661911 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:48 crc kubenswrapper[4790]: I0313 20:57:48.660044 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:48 crc kubenswrapper[4790]: E0313 20:57:48.660795 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:49 crc kubenswrapper[4790]: I0313 20:57:49.312612 4790 generic.go:334] "Generic (PLEG): container finished" podID="304addb4-f579-42f8-87d8-8e15b713aef2" containerID="c52e39aa381aa5d2fab1bd41bd98b85078dce93efb6dfc416500534a84998765" exitCode=0 Mar 13 20:57:49 crc kubenswrapper[4790]: I0313 20:57:49.312670 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerDied","Data":"c52e39aa381aa5d2fab1bd41bd98b85078dce93efb6dfc416500534a84998765"} Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.053307 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.060737 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-mg4xg"] Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.715277 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.827406 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") pod \"304addb4-f579-42f8-87d8-8e15b713aef2\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.827491 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") pod \"304addb4-f579-42f8-87d8-8e15b713aef2\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.827576 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") pod \"304addb4-f579-42f8-87d8-8e15b713aef2\" (UID: \"304addb4-f579-42f8-87d8-8e15b713aef2\") " Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.833551 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2" (OuterVolumeSpecName: "kube-api-access-57sh2") pod "304addb4-f579-42f8-87d8-8e15b713aef2" (UID: "304addb4-f579-42f8-87d8-8e15b713aef2"). InnerVolumeSpecName "kube-api-access-57sh2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.858860 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory" (OuterVolumeSpecName: "inventory") pod "304addb4-f579-42f8-87d8-8e15b713aef2" (UID: "304addb4-f579-42f8-87d8-8e15b713aef2"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.859664 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "304addb4-f579-42f8-87d8-8e15b713aef2" (UID: "304addb4-f579-42f8-87d8-8e15b713aef2"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.931673 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.931949 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57sh2\" (UniqueName: \"kubernetes.io/projected/304addb4-f579-42f8-87d8-8e15b713aef2-kube-api-access-57sh2\") on node \"crc\" DevicePath \"\"" Mar 13 20:57:50 crc kubenswrapper[4790]: I0313 20:57:50.932031 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/304addb4-f579-42f8-87d8-8e15b713aef2-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.330509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" event={"ID":"304addb4-f579-42f8-87d8-8e15b713aef2","Type":"ContainerDied","Data":"18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf"} Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.330800 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18ec06c077aaf4342514bd8545ccf6c47aa8e8c7b737cbfc0c2e1411462641bf" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.330619 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-cfp58" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.460346 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564"] Mar 13 20:57:51 crc kubenswrapper[4790]: E0313 20:57:51.461482 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="304addb4-f579-42f8-87d8-8e15b713aef2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.461507 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="304addb4-f579-42f8-87d8-8e15b713aef2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.462124 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="304addb4-f579-42f8-87d8-8e15b713aef2" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.463532 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.497853 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.498131 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.498306 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.498703 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.506235 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564"] Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.552176 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.552256 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.552660 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.654574 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.655281 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.655439 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.665313 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.671857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.675715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-vg564\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.675795 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eef97bfb-4275-4a0a-bae4-5442cf7400dd" path="/var/lib/kubelet/pods/eef97bfb-4275-4a0a-bae4-5442cf7400dd/volumes" Mar 13 20:57:51 crc kubenswrapper[4790]: I0313 20:57:51.840279 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:57:52 crc kubenswrapper[4790]: I0313 20:57:52.400992 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564"] Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.359001 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerStarted","Data":"0f2da745b394b5be4861e2d2e60fb64fdc25fcc05f8c0e3c406f5c5afdec6971"} Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.359335 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerStarted","Data":"33fb4801a3e818d29df5755724333f7626e2f157952f8e54477ff7fc99bb6957"} Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.385736 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" podStartSLOduration=1.908970958 podStartE2EDuration="2.385713512s" podCreationTimestamp="2026-03-13 20:57:51 +0000 UTC" firstStartedPulling="2026-03-13 20:57:52.407564734 +0000 UTC m=+1803.428680625" lastFinishedPulling="2026-03-13 20:57:52.884307288 +0000 UTC m=+1803.905423179" observedRunningTime="2026-03-13 20:57:53.378648368 +0000 UTC m=+1804.399764259" watchObservedRunningTime="2026-03-13 20:57:53.385713512 +0000 UTC m=+1804.406829423" Mar 13 20:57:53 crc kubenswrapper[4790]: I0313 20:57:53.997303 4790 scope.go:117] "RemoveContainer" containerID="7f1ca4be311e4bf8899acd7ffc7b40f8dd562b652669b076fe646ca2df5ae15e" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.023572 4790 scope.go:117] "RemoveContainer" containerID="5b9b7cadced0d29da460e85098fd79f31bf772b7450962d6c1f3bf09b62a0134" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.077515 4790 scope.go:117] "RemoveContainer" containerID="37311d8f14a45460392cc2657752fc09be6fc325071ebe0626eb04d799e80545" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.111398 4790 scope.go:117] "RemoveContainer" containerID="0f4d13a4ad3c2ce36bd8fc01aafd587a060f2b33fce34cbf54f0cbd83e9fb1ca" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.488225 4790 scope.go:117] "RemoveContainer" containerID="4f445f85254948b2a82910d93997f50d41021103d40e52ebd6447aec6a71de39" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.510630 4790 scope.go:117] "RemoveContainer" containerID="047c96b0959e792e896cbcb062d30482e777ac7ce2334a4427efe91c5a39d9a3" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.563856 4790 scope.go:117] "RemoveContainer" containerID="7db39c36784dd09efea0e74c586352f81de4ffb0a5c4d04fdfe061e937df855c" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.596856 4790 scope.go:117] "RemoveContainer" containerID="fb829732267d5d36436612626f2036bb0698b4bd86f5c88383f3ee7aba396142" Mar 13 20:57:54 crc kubenswrapper[4790]: I0313 20:57:54.618158 4790 scope.go:117] "RemoveContainer" containerID="36f3978e6e158babd7d1c6c18b804e801c1d5a860c6298e1e465b9030818d00c" Mar 13 20:57:58 crc kubenswrapper[4790]: I0313 20:57:58.032810 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:57:58 crc kubenswrapper[4790]: I0313 20:57:58.041943 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-db-sync-wbb8v"] Mar 13 20:57:59 crc kubenswrapper[4790]: I0313 20:57:59.667888 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:57:59 crc kubenswrapper[4790]: E0313 20:57:59.668365 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:57:59 crc kubenswrapper[4790]: I0313 20:57:59.683007 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8b8bbca-4be9-43d3-b692-0587892a50b4" path="/var/lib/kubelet/pods/e8b8bbca-4be9-43d3-b692-0587892a50b4/volumes" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.135857 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.138036 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.141056 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.141113 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.141064 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.143361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.184616 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"auto-csr-approver-29557258-gqmrr\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.285779 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"auto-csr-approver-29557258-gqmrr\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.305184 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"auto-csr-approver-29557258-gqmrr\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.494897 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:00 crc kubenswrapper[4790]: I0313 20:58:00.942067 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 20:58:01 crc kubenswrapper[4790]: I0313 20:58:01.564593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" event={"ID":"7706813b-e8e7-4b17-ba18-993c121eed66","Type":"ContainerStarted","Data":"27f1bb9cc432cac3351638db6116ab3783e0e90cad2ade62e350ed91b52d3746"} Mar 13 20:58:02 crc kubenswrapper[4790]: I0313 20:58:02.587733 4790 generic.go:334] "Generic (PLEG): container finished" podID="7706813b-e8e7-4b17-ba18-993c121eed66" containerID="98d6a341587e40eeb366a4b8a2eab51c3ea58fa67b5db767f9e2261febd34d64" exitCode=0 Mar 13 20:58:02 crc kubenswrapper[4790]: I0313 20:58:02.588085 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" event={"ID":"7706813b-e8e7-4b17-ba18-993c121eed66","Type":"ContainerDied","Data":"98d6a341587e40eeb366a4b8a2eab51c3ea58fa67b5db767f9e2261febd34d64"} Mar 13 20:58:03 crc kubenswrapper[4790]: I0313 20:58:03.909914 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.035494 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.045066 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m4zxn"] Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.057540 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") pod \"7706813b-e8e7-4b17-ba18-993c121eed66\" (UID: \"7706813b-e8e7-4b17-ba18-993c121eed66\") " Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.064682 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5" (OuterVolumeSpecName: "kube-api-access-g6st5") pod "7706813b-e8e7-4b17-ba18-993c121eed66" (UID: "7706813b-e8e7-4b17-ba18-993c121eed66"). InnerVolumeSpecName "kube-api-access-g6st5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.159439 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6st5\" (UniqueName: \"kubernetes.io/projected/7706813b-e8e7-4b17-ba18-993c121eed66-kube-api-access-g6st5\") on node \"crc\" DevicePath \"\"" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.608327 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" event={"ID":"7706813b-e8e7-4b17-ba18-993c121eed66","Type":"ContainerDied","Data":"27f1bb9cc432cac3351638db6116ab3783e0e90cad2ade62e350ed91b52d3746"} Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.608365 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="27f1bb9cc432cac3351638db6116ab3783e0e90cad2ade62e350ed91b52d3746" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.608450 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557258-gqmrr" Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.964530 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:58:04 crc kubenswrapper[4790]: I0313 20:58:04.971979 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557252-mfnmk"] Mar 13 20:58:05 crc kubenswrapper[4790]: I0313 20:58:05.669528 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b77751d8-7e07-4d67-9bed-3858cbfc5c3f" path="/var/lib/kubelet/pods/b77751d8-7e07-4d67-9bed-3858cbfc5c3f/volumes" Mar 13 20:58:05 crc kubenswrapper[4790]: I0313 20:58:05.670426 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd2c3694-0492-400f-98bd-b3c641edfac0" path="/var/lib/kubelet/pods/dd2c3694-0492-400f-98bd-b3c641edfac0/volumes" Mar 13 20:58:09 crc kubenswrapper[4790]: I0313 20:58:09.038582 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:58:09 crc kubenswrapper[4790]: I0313 20:58:09.049157 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-kkmzk"] Mar 13 20:58:09 crc kubenswrapper[4790]: I0313 20:58:09.671288 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dff6930-5d07-4df7-8d42-470ae83afd38" path="/var/lib/kubelet/pods/5dff6930-5d07-4df7-8d42-470ae83afd38/volumes" Mar 13 20:58:11 crc kubenswrapper[4790]: I0313 20:58:11.660344 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:11 crc kubenswrapper[4790]: E0313 20:58:11.661184 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:16 crc kubenswrapper[4790]: I0313 20:58:16.055835 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:58:16 crc kubenswrapper[4790]: I0313 20:58:16.063853 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-g2nmn"] Mar 13 20:58:17 crc kubenswrapper[4790]: I0313 20:58:17.671751 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32ffb609-7a3b-42b7-b513-7003deefe5dd" path="/var/lib/kubelet/pods/32ffb609-7a3b-42b7-b513-7003deefe5dd/volumes" Mar 13 20:58:26 crc kubenswrapper[4790]: I0313 20:58:26.660519 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:26 crc kubenswrapper[4790]: E0313 20:58:26.661328 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:40 crc kubenswrapper[4790]: I0313 20:58:40.659495 4790 scope.go:117] "RemoveContainer" 
containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:40 crc kubenswrapper[4790]: E0313 20:58:40.660169 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:51 crc kubenswrapper[4790]: I0313 20:58:51.660197 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:58:51 crc kubenswrapper[4790]: E0313 20:58:51.661913 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:58:54 crc kubenswrapper[4790]: I0313 20:58:54.770907 4790 scope.go:117] "RemoveContainer" containerID="b5ea61f802c1b094e15351a6cc95042eca8f16ab2272c8f7af336afbb299a8d5" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.245407 4790 scope.go:117] "RemoveContainer" containerID="51c35566a48d60d5e5b84368517b8e770f4896138c85e1636c2114cd13bfa196" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.281682 4790 scope.go:117] "RemoveContainer" containerID="de4f3208380e46019eb11e33bfcd9916170845c8672c15c2d9cbbb7f438283bb" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.347335 4790 scope.go:117] "RemoveContainer" containerID="062bb846937d0ad9d07de45246277a5920215b483e1948fdfbd9ea7168c9a51a" Mar 13 20:58:55 crc kubenswrapper[4790]: I0313 20:58:55.378773 4790 scope.go:117] "RemoveContainer" containerID="f2216663957b1ff7be0364b827b231924669a938cca6695aaf9da572dc71b0b9" Mar 13 20:59:01 crc kubenswrapper[4790]: I0313 20:59:01.085793 4790 generic.go:334] "Generic (PLEG): container finished" podID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerID="0f2da745b394b5be4861e2d2e60fb64fdc25fcc05f8c0e3c406f5c5afdec6971" exitCode=0 Mar 13 20:59:01 crc kubenswrapper[4790]: I0313 20:59:01.086314 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerDied","Data":"0f2da745b394b5be4861e2d2e60fb64fdc25fcc05f8c0e3c406f5c5afdec6971"} Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.054274 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.063639 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.071901 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.079525 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-lrnph"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.087294 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-db-create-jjv8c"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.095369 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-fe05-account-create-update-dwwd8"] Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.491244 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.568206 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") pod \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.568273 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") pod \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.568319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") pod \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\" (UID: \"c1609d29-96e5-43eb-a086-5587ca7c4f5a\") " Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.573828 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7" (OuterVolumeSpecName: "kube-api-access-9s5f7") pod "c1609d29-96e5-43eb-a086-5587ca7c4f5a" (UID: "c1609d29-96e5-43eb-a086-5587ca7c4f5a"). InnerVolumeSpecName "kube-api-access-9s5f7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.593784 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory" (OuterVolumeSpecName: "inventory") pod "c1609d29-96e5-43eb-a086-5587ca7c4f5a" (UID: "c1609d29-96e5-43eb-a086-5587ca7c4f5a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.604800 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c1609d29-96e5-43eb-a086-5587ca7c4f5a" (UID: "c1609d29-96e5-43eb-a086-5587ca7c4f5a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.670572 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.670621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s5f7\" (UniqueName: \"kubernetes.io/projected/c1609d29-96e5-43eb-a086-5587ca7c4f5a-kube-api-access-9s5f7\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:02 crc kubenswrapper[4790]: I0313 20:59:02.670630 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1609d29-96e5-43eb-a086-5587ca7c4f5a-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.035782 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.047756 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.056156 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.064361 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-kq55v"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.072784 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-8821-account-create-update-l6ffx"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.083282 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-926f-account-create-update-nnl2f"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.103465 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" event={"ID":"c1609d29-96e5-43eb-a086-5587ca7c4f5a","Type":"ContainerDied","Data":"33fb4801a3e818d29df5755724333f7626e2f157952f8e54477ff7fc99bb6957"} Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.103511 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33fb4801a3e818d29df5755724333f7626e2f157952f8e54477ff7fc99bb6957" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.103533 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-vg564" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.201890 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff"] Mar 13 20:59:03 crc kubenswrapper[4790]: E0313 20:59:03.202245 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" containerName="oc" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202264 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" containerName="oc" Mar 13 20:59:03 crc kubenswrapper[4790]: E0313 20:59:03.202289 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202297 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202503 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1609d29-96e5-43eb-a086-5587ca7c4f5a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.202534 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" containerName="oc" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.203141 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.210337 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.210542 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.210832 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.211108 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.217652 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff"] Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.279193 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.279284 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") 
" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.279314 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: E0313 20:59:03.289092 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1609d29_96e5_43eb_a086_5587ca7c4f5a.slice\": RecentStats: unable to find data in memory cache]" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.381473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.381585 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.381626 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.387133 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.398149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.402891 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-n54ff\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.529127 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.671417 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f4f78b-ccfb-4413-9a81-d5b461a5e319" path="/var/lib/kubelet/pods/00f4f78b-ccfb-4413-9a81-d5b461a5e319/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.672225 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4ef124-b4dd-43df-bdfb-97c65685977c" path="/var/lib/kubelet/pods/1a4ef124-b4dd-43df-bdfb-97c65685977c/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.672842 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="536b2b85-21d0-47ba-8825-998dcb7b0058" path="/var/lib/kubelet/pods/536b2b85-21d0-47ba-8825-998dcb7b0058/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.673388 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86c0a379-8f0b-4414-863c-eaed0745ce2d" path="/var/lib/kubelet/pods/86c0a379-8f0b-4414-863c-eaed0745ce2d/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.674509 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c861107-6a1d-49f7-bc63-b95008ee5ddc" path="/var/lib/kubelet/pods/9c861107-6a1d-49f7-bc63-b95008ee5ddc/volumes" Mar 13 20:59:03 crc kubenswrapper[4790]: I0313 20:59:03.675064 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dcc0f61e-f0ce-4443-9eec-0488ff92b388" path="/var/lib/kubelet/pods/dcc0f61e-f0ce-4443-9eec-0488ff92b388/volumes" Mar 13 20:59:04 crc kubenswrapper[4790]: W0313 20:59:04.044575 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod20beb5d9_49e6_47c7_a3ad_107ff79e56fd.slice/crio-f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0 WatchSource:0}: Error finding container f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0: Status 404 returned error can't find the container with id f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0 Mar 13 20:59:04 crc kubenswrapper[4790]: I0313 20:59:04.045350 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff"] Mar 13 20:59:04 crc kubenswrapper[4790]: I0313 20:59:04.048525 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 20:59:04 crc kubenswrapper[4790]: I0313 20:59:04.111665 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerStarted","Data":"f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0"} Mar 13 20:59:05 crc kubenswrapper[4790]: I0313 20:59:05.121737 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerStarted","Data":"21a384acf2e349aeba270f32d4e5aa2a5df9d099b690b8030fa9db6794e0997f"} Mar 13 20:59:05 crc kubenswrapper[4790]: I0313 20:59:05.143248 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" podStartSLOduration=1.4373358760000001 podStartE2EDuration="2.143228204s" podCreationTimestamp="2026-03-13 20:59:03 +0000 UTC" firstStartedPulling="2026-03-13 20:59:04.048253194 +0000 UTC m=+1875.069369075" lastFinishedPulling="2026-03-13 20:59:04.754145292 +0000 UTC m=+1875.775261403" observedRunningTime="2026-03-13 20:59:05.135150473 +0000 UTC m=+1876.156266364" watchObservedRunningTime="2026-03-13 20:59:05.143228204 +0000 UTC m=+1876.164344095" Mar 13 20:59:05 crc kubenswrapper[4790]: I0313 20:59:05.660367 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:05 crc kubenswrapper[4790]: E0313 20:59:05.660950 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:10 crc kubenswrapper[4790]: I0313 20:59:10.165277 4790 generic.go:334] "Generic (PLEG): container finished" podID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerID="21a384acf2e349aeba270f32d4e5aa2a5df9d099b690b8030fa9db6794e0997f" exitCode=0 Mar 13 20:59:10 crc kubenswrapper[4790]: I0313 20:59:10.165351 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerDied","Data":"21a384acf2e349aeba270f32d4e5aa2a5df9d099b690b8030fa9db6794e0997f"} Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.572677 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.640123 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") pod \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.640319 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") pod \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.640357 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") pod \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\" (UID: \"20beb5d9-49e6-47c7-a3ad-107ff79e56fd\") " Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.645844 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc" (OuterVolumeSpecName: "kube-api-access-7phtc") pod "20beb5d9-49e6-47c7-a3ad-107ff79e56fd" (UID: "20beb5d9-49e6-47c7-a3ad-107ff79e56fd"). InnerVolumeSpecName "kube-api-access-7phtc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.666815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory" (OuterVolumeSpecName: "inventory") pod "20beb5d9-49e6-47c7-a3ad-107ff79e56fd" (UID: "20beb5d9-49e6-47c7-a3ad-107ff79e56fd"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.668200 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "20beb5d9-49e6-47c7-a3ad-107ff79e56fd" (UID: "20beb5d9-49e6-47c7-a3ad-107ff79e56fd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.742784 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.742823 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:11 crc kubenswrapper[4790]: I0313 20:59:11.742837 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7phtc\" (UniqueName: \"kubernetes.io/projected/20beb5d9-49e6-47c7-a3ad-107ff79e56fd-kube-api-access-7phtc\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.182268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" event={"ID":"20beb5d9-49e6-47c7-a3ad-107ff79e56fd","Type":"ContainerDied","Data":"f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0"} Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.182570 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f71719d135da71fc8f3439eeddb8cb7ddd2e732ead2768f62f79060482322eb0" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.182416 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-n54ff" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.246275 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b"] Mar 13 20:59:12 crc kubenswrapper[4790]: E0313 20:59:12.246757 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.246774 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.247002 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="20beb5d9-49e6-47c7-a3ad-107ff79e56fd" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.247694 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.250708 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.251511 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.251897 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.252285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.257755 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b"] Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.354435 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.354867 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.354914 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 
20:59:12.457510 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.457591 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.457721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.461979 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.462990 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.478111 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-mjj4b\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:12 crc kubenswrapper[4790]: I0313 20:59:12.565893 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:13 crc kubenswrapper[4790]: I0313 20:59:13.091003 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b"] Mar 13 20:59:13 crc kubenswrapper[4790]: I0313 20:59:13.190727 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerStarted","Data":"4c0f8c6ac936b6063988c5028348af88d79a2e886d04d046b7d47b0ee8ff6d1a"} Mar 13 20:59:14 crc kubenswrapper[4790]: I0313 20:59:14.203232 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerStarted","Data":"681a7ba31862621718cb104c0d209c0de7ce953a8831d62f8f14d103d63a60a9"} Mar 13 20:59:14 crc kubenswrapper[4790]: I0313 20:59:14.226243 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" podStartSLOduration=1.816680792 podStartE2EDuration="2.226223958s" podCreationTimestamp="2026-03-13 20:59:12 +0000 UTC" firstStartedPulling="2026-03-13 20:59:13.09720017 +0000 UTC m=+1884.118316101" lastFinishedPulling="2026-03-13 20:59:13.506743376 +0000 UTC m=+1884.527859267" observedRunningTime="2026-03-13 20:59:14.223353469 +0000 UTC m=+1885.244469360" watchObservedRunningTime="2026-03-13 20:59:14.226223958 +0000 UTC m=+1885.247339869" Mar 13 20:59:17 crc kubenswrapper[4790]: I0313 20:59:17.661155 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:17 crc kubenswrapper[4790]: E0313 20:59:17.661935 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:28 crc kubenswrapper[4790]: I0313 20:59:28.660895 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:28 crc kubenswrapper[4790]: E0313 20:59:28.661709 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:35 crc kubenswrapper[4790]: I0313 20:59:35.037329 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:59:35 crc kubenswrapper[4790]: I0313 20:59:35.047653 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-82klj"] Mar 13 20:59:35 crc kubenswrapper[4790]: I0313 20:59:35.671398 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04b866fe-5d7d-46ab-9074-b93ddc7724f0" path="/var/lib/kubelet/pods/04b866fe-5d7d-46ab-9074-b93ddc7724f0/volumes" Mar 13 
20:59:40 crc kubenswrapper[4790]: I0313 20:59:40.659745 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:40 crc kubenswrapper[4790]: E0313 20:59:40.660540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 20:59:48 crc kubenswrapper[4790]: I0313 20:59:48.521083 4790 generic.go:334] "Generic (PLEG): container finished" podID="04553c47-94a9-465f-a241-9188784794de" containerID="681a7ba31862621718cb104c0d209c0de7ce953a8831d62f8f14d103d63a60a9" exitCode=0 Mar 13 20:59:48 crc kubenswrapper[4790]: I0313 20:59:48.521116 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerDied","Data":"681a7ba31862621718cb104c0d209c0de7ce953a8831d62f8f14d103d63a60a9"} Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.943482 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.987850 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.988070 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.988145 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:49 crc kubenswrapper[4790]: I0313 20:59:49.993323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96" (OuterVolumeSpecName: "kube-api-access-s9f96") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de"). InnerVolumeSpecName "kube-api-access-s9f96". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 20:59:50 crc kubenswrapper[4790]: E0313 20:59:50.011466 4790 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory podName:04553c47-94a9-465f-a241-9188784794de nodeName:}" failed. No retries permitted until 2026-03-13 20:59:50.511429466 +0000 UTC m=+1921.532545357 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de") : error deleting /var/lib/kubelet/pods/04553c47-94a9-465f-a241-9188784794de/volume-subpaths: remove /var/lib/kubelet/pods/04553c47-94a9-465f-a241-9188784794de/volume-subpaths: no such file or directory Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.014078 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.090907 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9f96\" (UniqueName: \"kubernetes.io/projected/04553c47-94a9-465f-a241-9188784794de-kube-api-access-s9f96\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.090942 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.539347 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" event={"ID":"04553c47-94a9-465f-a241-9188784794de","Type":"ContainerDied","Data":"4c0f8c6ac936b6063988c5028348af88d79a2e886d04d046b7d47b0ee8ff6d1a"} Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.539448 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c0f8c6ac936b6063988c5028348af88d79a2e886d04d046b7d47b0ee8ff6d1a" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.539503 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-mjj4b" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.600477 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") pod \"04553c47-94a9-465f-a241-9188784794de\" (UID: \"04553c47-94a9-465f-a241-9188784794de\") " Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.603889 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory" (OuterVolumeSpecName: "inventory") pod "04553c47-94a9-465f-a241-9188784794de" (UID: "04553c47-94a9-465f-a241-9188784794de"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.641300 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk"] Mar 13 20:59:50 crc kubenswrapper[4790]: E0313 20:59:50.641781 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04553c47-94a9-465f-a241-9188784794de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.641800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="04553c47-94a9-465f-a241-9188784794de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.642028 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="04553c47-94a9-465f-a241-9188784794de" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.642831 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.647411 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk"] Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.702246 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.702670 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.702788 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.703061 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/04553c47-94a9-465f-a241-9188784794de-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.805305 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.805360 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.805403 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.809559 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.809810 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.821635 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:50 crc kubenswrapper[4790]: I0313 20:59:50.984632 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 20:59:51 crc kubenswrapper[4790]: I0313 20:59:51.478116 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk"] Mar 13 20:59:51 crc kubenswrapper[4790]: W0313 20:59:51.481021 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7e18dc0_dbbb_419e_bdad_22b5f08ffa6f.slice/crio-a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256 WatchSource:0}: Error finding container a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256: Status 404 returned error can't find the container with id a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256 Mar 13 20:59:51 crc kubenswrapper[4790]: I0313 20:59:51.549597 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerStarted","Data":"a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256"} Mar 13 20:59:52 crc kubenswrapper[4790]: I0313 20:59:52.560490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerStarted","Data":"c8ddee344e61f57e55bf975cc9ff728e15bf1f3150e4544252973126358814b9"} Mar 13 20:59:52 crc kubenswrapper[4790]: I0313 20:59:52.578526 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" podStartSLOduration=2.091700361 podStartE2EDuration="2.578509463s" podCreationTimestamp="2026-03-13 20:59:50 +0000 UTC" firstStartedPulling="2026-03-13 20:59:51.483593953 +0000 UTC m=+1922.504709844" lastFinishedPulling="2026-03-13 20:59:51.970403055 +0000 UTC m=+1922.991518946" observedRunningTime="2026-03-13 20:59:52.577334921 +0000 UTC m=+1923.598450832" watchObservedRunningTime="2026-03-13 20:59:52.578509463 +0000 UTC m=+1923.599625354" Mar 13 20:59:52 crc kubenswrapper[4790]: I0313 20:59:52.660238 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.043537 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.051158 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gj4j7"] Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.578996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4"} Mar 13 20:59:53 crc kubenswrapper[4790]: I0313 20:59:53.672002 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71d98c3-e247-448e-945e-016a6755c689" path="/var/lib/kubelet/pods/e71d98c3-e247-448e-945e-016a6755c689/volumes" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.024938 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.032991 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-bh2vb"] Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.559282 4790 scope.go:117] "RemoveContainer" containerID="ac99b8592ceb7c3e6a37fbb0c9de0300f9c9ee5a2b4807abffe2d2ed52e8fe04" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.584462 4790 scope.go:117] "RemoveContainer" containerID="6d9662cc81f66265ce8ecfaf149044a45f9586bc1e7f991bca5d3650ff0fd63f" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.646355 4790 scope.go:117] "RemoveContainer" containerID="749c82e4067fc52a2714101b9401b4c82b0470e8a2bd0821a82732111bf3a2ae" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.672352 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="255451e0-9cb8-424f-a327-6e7ef4e4d775" path="/var/lib/kubelet/pods/255451e0-9cb8-424f-a327-6e7ef4e4d775/volumes" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.673693 4790 scope.go:117] "RemoveContainer" containerID="670aaab126129ee380c6ae05f38d955bab6fe47a4a8d19ac0dbaca35d3cd9ecc" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.738344 4790 scope.go:117] "RemoveContainer" containerID="2532c9c9471a4f51d2c72742172102590d5f8b86465110fbcffff19c31b75b68" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.761018 4790 scope.go:117] "RemoveContainer" containerID="ecda3f7499b0977157d22e381725d43a5571bfd9425676b723008c4d5d967330" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.802953 4790 scope.go:117] "RemoveContainer" containerID="c0e58f35f1d7b48efbdbbc91a297aa591c210bb71e60644cb81c14c40a9e45cb" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.853612 4790 scope.go:117] "RemoveContainer" containerID="d6d96802df47b7b6e53732dfd053c7dabc95a96dcf532db8586c981fb4fcd115" Mar 13 20:59:55 crc kubenswrapper[4790]: I0313 20:59:55.911684 4790 scope.go:117] "RemoveContainer" containerID="15f4fd3d9e2092ff500a17b34ac7be646f532a2e8275aea162c7ec8133dbdbed" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.146421 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.148353 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.150669 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.150668 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.153461 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.155400 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.156733 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.159257 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.162267 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.168039 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.178249 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf"] Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213516 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"auto-csr-approver-29557260-m6wtk\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213800 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213888 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.213963 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315363 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"auto-csr-approver-29557260-m6wtk\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315500 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315531 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.315565 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.316228 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.327328 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.332947 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"auto-csr-approver-29557260-m6wtk\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.332948 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"collect-profiles-29557260-8j5bf\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.469239 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.481599 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:00 crc kubenswrapper[4790]: I0313 21:00:00.922429 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:00:01 crc kubenswrapper[4790]: W0313 21:00:01.016042 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod427c23ef_3e13_432b_98b4_08a6aa5b7cff.slice/crio-e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32 WatchSource:0}: Error finding container e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32: Status 404 returned error can't find the container with id e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32 Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.019065 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf"] Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.668468 4790 generic.go:334] "Generic (PLEG): container finished" podID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerID="6b6ac28f388fd53f46ab8b3943c1bd45ee090848b8aabd597c3e5c7ae5662495" exitCode=0 Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.669490 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerStarted","Data":"e5c5b460ceb349db78baa661bfdbefbccd67389a158be1d27ad4b75995e8252b"} Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.669528 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" event={"ID":"427c23ef-3e13-432b-98b4-08a6aa5b7cff","Type":"ContainerDied","Data":"6b6ac28f388fd53f46ab8b3943c1bd45ee090848b8aabd597c3e5c7ae5662495"} Mar 13 21:00:01 crc kubenswrapper[4790]: I0313 21:00:01.669540 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" event={"ID":"427c23ef-3e13-432b-98b4-08a6aa5b7cff","Type":"ContainerStarted","Data":"e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32"} Mar 13 21:00:02 crc kubenswrapper[4790]: I0313 21:00:02.992003 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.067169 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") pod \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.067255 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") pod \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.067338 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") pod \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\" (UID: \"427c23ef-3e13-432b-98b4-08a6aa5b7cff\") " Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.068625 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume" (OuterVolumeSpecName: "config-volume") pod "427c23ef-3e13-432b-98b4-08a6aa5b7cff" (UID: "427c23ef-3e13-432b-98b4-08a6aa5b7cff"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.073992 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "427c23ef-3e13-432b-98b4-08a6aa5b7cff" (UID: "427c23ef-3e13-432b-98b4-08a6aa5b7cff"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.074280 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96" (OuterVolumeSpecName: "kube-api-access-n7m96") pod "427c23ef-3e13-432b-98b4-08a6aa5b7cff" (UID: "427c23ef-3e13-432b-98b4-08a6aa5b7cff"). InnerVolumeSpecName "kube-api-access-n7m96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.169657 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/427c23ef-3e13-432b-98b4-08a6aa5b7cff-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.169706 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/427c23ef-3e13-432b-98b4-08a6aa5b7cff-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.169718 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7m96\" (UniqueName: \"kubernetes.io/projected/427c23ef-3e13-432b-98b4-08a6aa5b7cff-kube-api-access-n7m96\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.691616 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" event={"ID":"427c23ef-3e13-432b-98b4-08a6aa5b7cff","Type":"ContainerDied","Data":"e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32"} Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.692155 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e563b5c0f8a16abc16171fe76c0a970215a28ff45f70931625f4bc87069b5e32" Mar 13 21:00:03 crc kubenswrapper[4790]: I0313 21:00:03.692263 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557260-8j5bf" Mar 13 21:00:04 crc kubenswrapper[4790]: I0313 21:00:04.700133 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerStarted","Data":"5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1"} Mar 13 21:00:04 crc kubenswrapper[4790]: I0313 21:00:04.714016 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" podStartSLOduration=1.257896753 podStartE2EDuration="4.713995785s" podCreationTimestamp="2026-03-13 21:00:00 +0000 UTC" firstStartedPulling="2026-03-13 21:00:00.930471785 +0000 UTC m=+1931.951587676" lastFinishedPulling="2026-03-13 21:00:04.386570817 +0000 UTC m=+1935.407686708" observedRunningTime="2026-03-13 21:00:04.713504211 +0000 UTC m=+1935.734620102" watchObservedRunningTime="2026-03-13 21:00:04.713995785 +0000 UTC m=+1935.735111676" Mar 13 21:00:05 crc kubenswrapper[4790]: I0313 21:00:05.712801 4790 generic.go:334] "Generic (PLEG): container finished" podID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerID="5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1" exitCode=0 Mar 13 21:00:05 crc kubenswrapper[4790]: I0313 21:00:05.712868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerDied","Data":"5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1"} Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.070319 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.139912 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") pod \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\" (UID: \"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0\") " Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.145568 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt" (OuterVolumeSpecName: "kube-api-access-77wxt") pod "6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" (UID: "6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0"). InnerVolumeSpecName "kube-api-access-77wxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.243589 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-77wxt\" (UniqueName: \"kubernetes.io/projected/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0-kube-api-access-77wxt\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.739736 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" event={"ID":"6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0","Type":"ContainerDied","Data":"e5c5b460ceb349db78baa661bfdbefbccd67389a158be1d27ad4b75995e8252b"} Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.739776 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5c5b460ceb349db78baa661bfdbefbccd67389a158be1d27ad4b75995e8252b" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.739962 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557260-m6wtk" Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.775367 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 21:00:07 crc kubenswrapper[4790]: I0313 21:00:07.784121 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557254-62lxw"] Mar 13 21:00:09 crc kubenswrapper[4790]: I0313 21:00:09.676859 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="173eb1b0-728a-4420-bfab-ba33ae08f5eb" path="/var/lib/kubelet/pods/173eb1b0-728a-4420-bfab-ba33ae08f5eb/volumes" Mar 13 21:00:36 crc kubenswrapper[4790]: I0313 21:00:36.001639 4790 generic.go:334] "Generic (PLEG): container finished" podID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerID="c8ddee344e61f57e55bf975cc9ff728e15bf1f3150e4544252973126358814b9" exitCode=0 Mar 13 21:00:36 crc kubenswrapper[4790]: I0313 21:00:36.001752 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerDied","Data":"c8ddee344e61f57e55bf975cc9ff728e15bf1f3150e4544252973126358814b9"} Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.046576 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.056908 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-sw4k5"] Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.413054 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.549128 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") pod \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.549367 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") pod \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.549473 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") pod \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\" (UID: \"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f\") " Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.555323 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56" (OuterVolumeSpecName: "kube-api-access-hsn56") pod "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" (UID: "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f"). InnerVolumeSpecName "kube-api-access-hsn56". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.579137 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" (UID: "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.581595 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory" (OuterVolumeSpecName: "inventory") pod "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" (UID: "f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.652140 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.652176 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.652186 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsn56\" (UniqueName: \"kubernetes.io/projected/f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f-kube-api-access-hsn56\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:37 crc kubenswrapper[4790]: I0313 21:00:37.670895 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="263e3744-6b98-4d91-aba2-cd28a616d9df" path="/var/lib/kubelet/pods/263e3744-6b98-4d91-aba2-cd28a616d9df/volumes" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.025177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" event={"ID":"f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f","Type":"ContainerDied","Data":"a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256"} Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.025213 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6fa19e5fb9a52e8d274e36f7e61c8bca9ed52e76cf05a06f56817ed6042a256" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.025600 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.124125 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pdxrj"] Mar 13 21:00:38 crc kubenswrapper[4790]: E0313 21:00:38.125244 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerName="collect-profiles" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125323 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerName="collect-profiles" Mar 13 21:00:38 crc kubenswrapper[4790]: E0313 21:00:38.125398 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerName="oc" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125453 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerName="oc" Mar 13 21:00:38 crc kubenswrapper[4790]: E0313 21:00:38.125521 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125603 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125864 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.125945 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="427c23ef-3e13-432b-98b4-08a6aa5b7cff" containerName="collect-profiles" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.126011 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" containerName="oc" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.126714 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.129701 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.129943 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.130164 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.130221 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.137706 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pdxrj"] Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.266302 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.266395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.266495 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.370193 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.370248 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.370338 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc 
kubenswrapper[4790]: I0313 21:00:38.379335 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.381607 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.388206 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ssh-known-hosts-edpm-deployment-pdxrj\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.443330 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:38 crc kubenswrapper[4790]: I0313 21:00:38.906819 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-pdxrj"] Mar 13 21:00:39 crc kubenswrapper[4790]: I0313 21:00:39.034842 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerStarted","Data":"5f6755c2ce51cca35693a7909f948d2dc09cc15243764bfb0a65cce83e1980ba"} Mar 13 21:00:40 crc kubenswrapper[4790]: I0313 21:00:40.044024 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerStarted","Data":"77d248b88c2d0c4526ca72adc06ae8f841c05ba15cefbf5df17827e6142b336f"} Mar 13 21:00:40 crc kubenswrapper[4790]: I0313 21:00:40.066232 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" podStartSLOduration=1.6241604170000001 podStartE2EDuration="2.066215097s" podCreationTimestamp="2026-03-13 21:00:38 +0000 UTC" firstStartedPulling="2026-03-13 21:00:38.917236088 +0000 UTC m=+1969.938351979" lastFinishedPulling="2026-03-13 21:00:39.359290768 +0000 UTC m=+1970.380406659" observedRunningTime="2026-03-13 21:00:40.063131913 +0000 UTC m=+1971.084247814" watchObservedRunningTime="2026-03-13 21:00:40.066215097 +0000 UTC m=+1971.087330988" Mar 13 21:00:46 crc kubenswrapper[4790]: I0313 21:00:46.094687 4790 generic.go:334] "Generic (PLEG): container finished" podID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerID="77d248b88c2d0c4526ca72adc06ae8f841c05ba15cefbf5df17827e6142b336f" exitCode=0 Mar 13 21:00:46 crc kubenswrapper[4790]: I0313 21:00:46.094793 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerDied","Data":"77d248b88c2d0c4526ca72adc06ae8f841c05ba15cefbf5df17827e6142b336f"} Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.533753 4790 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.638115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") pod \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.638263 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") pod \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.638402 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") pod \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\" (UID: \"ee77aab6-b3c2-4925-a715-428a4c5e5bd9\") " Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.649920 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4" (OuterVolumeSpecName: "kube-api-access-96gt4") pod "ee77aab6-b3c2-4925-a715-428a4c5e5bd9" (UID: "ee77aab6-b3c2-4925-a715-428a4c5e5bd9"). InnerVolumeSpecName "kube-api-access-96gt4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.665516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ee77aab6-b3c2-4925-a715-428a4c5e5bd9" (UID: "ee77aab6-b3c2-4925-a715-428a4c5e5bd9"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.670742 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "ee77aab6-b3c2-4925-a715-428a4c5e5bd9" (UID: "ee77aab6-b3c2-4925-a715-428a4c5e5bd9"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.741582 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.741631 4790 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:47 crc kubenswrapper[4790]: I0313 21:00:47.741642 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96gt4\" (UniqueName: \"kubernetes.io/projected/ee77aab6-b3c2-4925-a715-428a4c5e5bd9-kube-api-access-96gt4\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.112162 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" event={"ID":"ee77aab6-b3c2-4925-a715-428a4c5e5bd9","Type":"ContainerDied","Data":"5f6755c2ce51cca35693a7909f948d2dc09cc15243764bfb0a65cce83e1980ba"} Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.112202 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f6755c2ce51cca35693a7909f948d2dc09cc15243764bfb0a65cce83e1980ba" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.112253 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-pdxrj" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.177173 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8"] Mar 13 21:00:48 crc kubenswrapper[4790]: E0313 21:00:48.177755 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerName="ssh-known-hosts-edpm-deployment" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.177783 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerName="ssh-known-hosts-edpm-deployment" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.178045 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee77aab6-b3c2-4925-a715-428a4c5e5bd9" containerName="ssh-known-hosts-edpm-deployment" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.178878 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.182560 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.182818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.182892 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.183291 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.191880 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8"] Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.353526 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.353743 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.354076 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.456423 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.456813 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.456942 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod 
\"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.463003 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.464410 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.473062 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-rhtx8\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:48 crc kubenswrapper[4790]: I0313 21:00:48.505261 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:49 crc kubenswrapper[4790]: I0313 21:00:49.019208 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8"] Mar 13 21:00:49 crc kubenswrapper[4790]: I0313 21:00:49.121198 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerStarted","Data":"bdebea233f23ef8d949a327f2db14c145b93e2ed9f847d431dd693279b443afa"} Mar 13 21:00:49 crc kubenswrapper[4790]: I0313 21:00:49.748116 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:00:50 crc kubenswrapper[4790]: I0313 21:00:50.133537 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerStarted","Data":"3dbda74f382829d90b8f3ab821618c380a983071e325c3af0881d8d1ec980d7e"} Mar 13 21:00:50 crc kubenswrapper[4790]: I0313 21:00:50.159627 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" podStartSLOduration=1.4456798530000001 podStartE2EDuration="2.159595113s" podCreationTimestamp="2026-03-13 21:00:48 +0000 UTC" firstStartedPulling="2026-03-13 21:00:49.031862945 +0000 UTC m=+1980.052978836" lastFinishedPulling="2026-03-13 21:00:49.745778205 +0000 UTC m=+1980.766894096" observedRunningTime="2026-03-13 21:00:50.147603564 +0000 UTC m=+1981.168719455" watchObservedRunningTime="2026-03-13 21:00:50.159595113 +0000 UTC m=+1981.180711034" Mar 13 21:00:56 crc kubenswrapper[4790]: I0313 21:00:56.068446 4790 scope.go:117] "RemoveContainer" containerID="1354228427a90e6609d9b0170fc1b61342fc6ff24449709c9abd0f642ea90a66" Mar 13 21:00:56 crc 
kubenswrapper[4790]: I0313 21:00:56.108354 4790 scope.go:117] "RemoveContainer" containerID="4f8e347d99704add2e53a060aced55cc22039113443643e0c09d3500a1b42570" Mar 13 21:00:58 crc kubenswrapper[4790]: I0313 21:00:58.225238 4790 generic.go:334] "Generic (PLEG): container finished" podID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerID="3dbda74f382829d90b8f3ab821618c380a983071e325c3af0881d8d1ec980d7e" exitCode=0 Mar 13 21:00:58 crc kubenswrapper[4790]: I0313 21:00:58.225449 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerDied","Data":"3dbda74f382829d90b8f3ab821618c380a983071e325c3af0881d8d1ec980d7e"} Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.656520 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.670313 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") pod \"d19bd67c-441b-4813-8cc3-07c8cf446e42\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.670750 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") pod \"d19bd67c-441b-4813-8cc3-07c8cf446e42\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.670904 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") pod \"d19bd67c-441b-4813-8cc3-07c8cf446e42\" (UID: \"d19bd67c-441b-4813-8cc3-07c8cf446e42\") " Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.687304 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr" (OuterVolumeSpecName: "kube-api-access-xx9nr") pod "d19bd67c-441b-4813-8cc3-07c8cf446e42" (UID: "d19bd67c-441b-4813-8cc3-07c8cf446e42"). InnerVolumeSpecName "kube-api-access-xx9nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.713536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory" (OuterVolumeSpecName: "inventory") pod "d19bd67c-441b-4813-8cc3-07c8cf446e42" (UID: "d19bd67c-441b-4813-8cc3-07c8cf446e42"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.716673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d19bd67c-441b-4813-8cc3-07c8cf446e42" (UID: "d19bd67c-441b-4813-8cc3-07c8cf446e42"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.773740 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.773774 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx9nr\" (UniqueName: \"kubernetes.io/projected/d19bd67c-441b-4813-8cc3-07c8cf446e42-kube-api-access-xx9nr\") on node \"crc\" DevicePath \"\"" Mar 13 21:00:59 crc kubenswrapper[4790]: I0313 21:00:59.773784 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d19bd67c-441b-4813-8cc3-07c8cf446e42-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.141845 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29557261-5pp9q"] Mar 13 21:01:00 crc kubenswrapper[4790]: E0313 21:01:00.142660 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.142684 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.142867 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d19bd67c-441b-4813-8cc3-07c8cf446e42" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.144481 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.152885 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557261-5pp9q"] Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180070 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180270 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.180406 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.264914 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" event={"ID":"d19bd67c-441b-4813-8cc3-07c8cf446e42","Type":"ContainerDied","Data":"bdebea233f23ef8d949a327f2db14c145b93e2ed9f847d431dd693279b443afa"} Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.264952 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdebea233f23ef8d949a327f2db14c145b93e2ed9f847d431dd693279b443afa" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.265016 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-rhtx8" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282223 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282298 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.282717 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.294857 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.295808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.300252 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.314269 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"keystone-cron-29557261-5pp9q\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.395904 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt"] Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.397197 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.403764 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.404050 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.404227 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.404439 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.414951 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt"] Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.464912 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.588703 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.589025 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.589095 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.691088 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.691143 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.691206 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.700527 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.712492 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.713610 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.720043 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:00 crc kubenswrapper[4790]: I0313 21:01:00.922092 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29557261-5pp9q"] Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.250679 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt"] Mar 13 21:01:01 crc kubenswrapper[4790]: W0313 21:01:01.252710 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cb0d614_f5d9_4862_8059_ad323eec6c59.slice/crio-285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423 WatchSource:0}: Error finding container 285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423: Status 404 returned error can't find the container with id 285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423 Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.285265 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerStarted","Data":"983aa109c38a604ee34bc992ac30687ce8449ee6da1a3d0137206237482d8e8f"} Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.285310 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerStarted","Data":"f535c2e9bfbdd2df1c7cf36740fcc50bfa9a5f7bfe93e05fcf8c23101a3e8eec"} Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.288322 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerStarted","Data":"285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423"} Mar 13 21:01:01 crc kubenswrapper[4790]: I0313 21:01:01.313872 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29557261-5pp9q" podStartSLOduration=1.3138463439999999 podStartE2EDuration="1.313846344s" podCreationTimestamp="2026-03-13 21:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 21:01:01.30015951 +0000 UTC m=+1992.321275411" watchObservedRunningTime="2026-03-13 21:01:01.313846344 +0000 UTC m=+1992.334962235" Mar 13 21:01:02 crc kubenswrapper[4790]: I0313 21:01:02.299018 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerStarted","Data":"54eadb04d171d0b6ed335700084015547ac87a7b17db8895122b9015adb30fc3"} Mar 13 21:01:02 crc kubenswrapper[4790]: I0313 21:01:02.328962 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" podStartSLOduration=1.870610878 podStartE2EDuration="2.328944926s" podCreationTimestamp="2026-03-13 21:01:00 +0000 UTC" firstStartedPulling="2026-03-13 21:01:01.260526013 +0000 UTC m=+1992.281641894" lastFinishedPulling="2026-03-13 21:01:01.718860051 +0000 UTC m=+1992.739975942" observedRunningTime="2026-03-13 21:01:02.326598502 +0000 UTC m=+1993.347714393" watchObservedRunningTime="2026-03-13 21:01:02.328944926 +0000 UTC m=+1993.350060817" Mar 13 21:01:03 crc kubenswrapper[4790]: I0313 21:01:03.308906 4790 generic.go:334] "Generic (PLEG): container finished" podID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerID="983aa109c38a604ee34bc992ac30687ce8449ee6da1a3d0137206237482d8e8f" exitCode=0 Mar 13 21:01:03 crc kubenswrapper[4790]: I0313 21:01:03.309157 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerDied","Data":"983aa109c38a604ee34bc992ac30687ce8449ee6da1a3d0137206237482d8e8f"} Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.673908 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776182 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776279 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776333 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.776399 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") pod \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\" (UID: \"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648\") " Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.782238 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b" (OuterVolumeSpecName: "kube-api-access-x885b") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "kube-api-access-x885b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.782288 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.805363 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.833215 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data" (OuterVolumeSpecName: "config-data") pod "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" (UID: "65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878103 4790 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878145 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878155 4790 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:04 crc kubenswrapper[4790]: I0313 21:01:04.878167 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x885b\" (UniqueName: \"kubernetes.io/projected/65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648-kube-api-access-x885b\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:05 crc kubenswrapper[4790]: I0313 21:01:05.326177 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29557261-5pp9q" event={"ID":"65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648","Type":"ContainerDied","Data":"f535c2e9bfbdd2df1c7cf36740fcc50bfa9a5f7bfe93e05fcf8c23101a3e8eec"} Mar 13 21:01:05 crc kubenswrapper[4790]: I0313 21:01:05.326219 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f535c2e9bfbdd2df1c7cf36740fcc50bfa9a5f7bfe93e05fcf8c23101a3e8eec" Mar 13 21:01:05 crc kubenswrapper[4790]: I0313 21:01:05.326270 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29557261-5pp9q" Mar 13 21:01:11 crc kubenswrapper[4790]: I0313 21:01:11.375423 4790 generic.go:334] "Generic (PLEG): container finished" podID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerID="54eadb04d171d0b6ed335700084015547ac87a7b17db8895122b9015adb30fc3" exitCode=0 Mar 13 21:01:11 crc kubenswrapper[4790]: I0313 21:01:11.375494 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerDied","Data":"54eadb04d171d0b6ed335700084015547ac87a7b17db8895122b9015adb30fc3"} Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.783868 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.853604 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") pod \"7cb0d614-f5d9-4862-8059-ad323eec6c59\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.853807 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") pod \"7cb0d614-f5d9-4862-8059-ad323eec6c59\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.853872 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") pod \"7cb0d614-f5d9-4862-8059-ad323eec6c59\" (UID: \"7cb0d614-f5d9-4862-8059-ad323eec6c59\") " Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.859775 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr" (OuterVolumeSpecName: "kube-api-access-c6ljr") pod "7cb0d614-f5d9-4862-8059-ad323eec6c59" (UID: "7cb0d614-f5d9-4862-8059-ad323eec6c59"). InnerVolumeSpecName "kube-api-access-c6ljr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.882466 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory" (OuterVolumeSpecName: "inventory") pod "7cb0d614-f5d9-4862-8059-ad323eec6c59" (UID: "7cb0d614-f5d9-4862-8059-ad323eec6c59"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.888415 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7cb0d614-f5d9-4862-8059-ad323eec6c59" (UID: "7cb0d614-f5d9-4862-8059-ad323eec6c59"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.958272 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.958310 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7cb0d614-f5d9-4862-8059-ad323eec6c59-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:12 crc kubenswrapper[4790]: I0313 21:01:12.958321 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6ljr\" (UniqueName: \"kubernetes.io/projected/7cb0d614-f5d9-4862-8059-ad323eec6c59-kube-api-access-c6ljr\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.392996 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" event={"ID":"7cb0d614-f5d9-4862-8059-ad323eec6c59","Type":"ContainerDied","Data":"285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423"} Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.393036 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="285d58131654439d3af5994aa7631a26ae9f04f1b609a18bfb283c53325db423" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.393054 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471149 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5"] Mar 13 21:01:13 crc kubenswrapper[4790]: E0313 21:01:13.471527 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471543 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: E0313 21:01:13.471578 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerName="keystone-cron" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471586 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerName="keystone-cron" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471771 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648" containerName="keystone-cron" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.471787 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cb0d614-f5d9-4862-8059-ad323eec6c59" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.472369 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.475271 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.475655 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.475779 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.476738 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.476880 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.476817 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.477552 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.477643 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.494674 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5"] Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571239 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571323 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571373 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571434 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571481 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571583 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571622 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571680 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571711 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571739 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: 
\"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571777 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.571861 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.673922 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.673961 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.673991 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674015 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674053 
4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674084 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674137 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674163 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674183 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674231 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674249 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674275 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.674312 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.679036 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.681960 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.681969 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682095 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682208 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc 
kubenswrapper[4790]: I0313 21:01:13.682420 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682500 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.682514 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.683214 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.683844 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.688761 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.690774 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.692484 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.694998 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-mznb5\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:13 crc kubenswrapper[4790]: I0313 21:01:13.790685 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:14 crc kubenswrapper[4790]: I0313 21:01:14.300976 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5"] Mar 13 21:01:14 crc kubenswrapper[4790]: I0313 21:01:14.403362 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerStarted","Data":"426f65e9c26d39b596dc573505d7b808196f8c14f2fa3cc5ae9289099b5aa2e9"} Mar 13 21:01:15 crc kubenswrapper[4790]: I0313 21:01:15.413832 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerStarted","Data":"33421c8bca23196e37ea69a9dbe3facb2e231c9ca13be229cc160b45432ee770"} Mar 13 21:01:15 crc kubenswrapper[4790]: I0313 21:01:15.439839 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" podStartSLOduration=2.007804467 podStartE2EDuration="2.439820273s" podCreationTimestamp="2026-03-13 21:01:13 +0000 UTC" firstStartedPulling="2026-03-13 21:01:14.304247172 +0000 UTC m=+2005.325363053" lastFinishedPulling="2026-03-13 21:01:14.736262968 +0000 UTC m=+2005.757378859" observedRunningTime="2026-03-13 21:01:15.434124147 +0000 UTC m=+2006.455240038" watchObservedRunningTime="2026-03-13 21:01:15.439820273 +0000 UTC m=+2006.460936164" Mar 13 21:01:47 crc kubenswrapper[4790]: I0313 21:01:47.678741 4790 generic.go:334] "Generic (PLEG): container finished" podID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerID="33421c8bca23196e37ea69a9dbe3facb2e231c9ca13be229cc160b45432ee770" exitCode=0 Mar 13 21:01:47 crc kubenswrapper[4790]: I0313 21:01:47.678833 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerDied","Data":"33421c8bca23196e37ea69a9dbe3facb2e231c9ca13be229cc160b45432ee770"} Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.081719 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.232541 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.232933 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.232968 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233721 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233789 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233854 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233889 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 
21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.233924 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.235862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.236350 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.236539 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.236652 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\" (UID: \"77bc94c9-b530-4ea9-8c94-0d5a985fb930\") " Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241408 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241513 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241692 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt" (OuterVolumeSpecName: "kube-api-access-qs5kt") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "kube-api-access-qs5kt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241738 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241883 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.241951 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.242888 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.244473 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.244485 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.244769 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.245080 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.246740 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.272817 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.275339 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory" (OuterVolumeSpecName: "inventory") pod "77bc94c9-b530-4ea9-8c94-0d5a985fb930" (UID: "77bc94c9-b530-4ea9-8c94-0d5a985fb930"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339684 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339723 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339734 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339743 4790 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339754 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs5kt\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-kube-api-access-qs5kt\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339762 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339773 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339783 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339793 4790 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339801 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339809 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339818 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/77bc94c9-b530-4ea9-8c94-0d5a985fb930-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339828 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.339836 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77bc94c9-b530-4ea9-8c94-0d5a985fb930-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.697719 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" event={"ID":"77bc94c9-b530-4ea9-8c94-0d5a985fb930","Type":"ContainerDied","Data":"426f65e9c26d39b596dc573505d7b808196f8c14f2fa3cc5ae9289099b5aa2e9"} Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.697762 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-mznb5" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.697772 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="426f65e9c26d39b596dc573505d7b808196f8c14f2fa3cc5ae9289099b5aa2e9" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.789086 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq"] Mar 13 21:01:49 crc kubenswrapper[4790]: E0313 21:01:49.789481 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.789499 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.789720 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="77bc94c9-b530-4ea9-8c94-0d5a985fb930" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.790300 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.794818 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795211 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795400 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795666 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.795975 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.803010 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq"] Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.951842 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952470 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952589 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:49 crc kubenswrapper[4790]: I0313 21:01:49.952704 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054109 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054272 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054315 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054419 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.054461 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.055250 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.062322 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.062425 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.062494 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.074707 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-7d9cq\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.121907 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.640240 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq"] Mar 13 21:01:50 crc kubenswrapper[4790]: I0313 21:01:50.708888 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerStarted","Data":"e338edf26926f4ba8724c786baa1f3ad6cf40efaff64a89683a9459869e28ac9"} Mar 13 21:01:51 crc kubenswrapper[4790]: I0313 21:01:51.719192 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerStarted","Data":"f70c8e8a59a2069c4d31e1d743181c1e04d84701a48632b52e0eea3913ba3e41"} Mar 13 21:01:51 crc kubenswrapper[4790]: I0313 21:01:51.748344 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" podStartSLOduration=2.295100786 podStartE2EDuration="2.748322754s" podCreationTimestamp="2026-03-13 21:01:49 +0000 UTC" firstStartedPulling="2026-03-13 21:01:50.642903327 +0000 UTC m=+2041.664019218" lastFinishedPulling="2026-03-13 21:01:51.096125295 +0000 UTC m=+2042.117241186" observedRunningTime="2026-03-13 21:01:51.739680967 +0000 UTC m=+2042.760796858" watchObservedRunningTime="2026-03-13 21:01:51.748322754 +0000 UTC m=+2042.769438645" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.141992 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.143745 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.145770 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.145837 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.146032 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.150934 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.250757 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"auto-csr-approver-29557262-wdkmw\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.353110 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"auto-csr-approver-29557262-wdkmw\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.378016 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"auto-csr-approver-29557262-wdkmw\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.465873 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:00 crc kubenswrapper[4790]: I0313 21:02:00.903100 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:02:01 crc kubenswrapper[4790]: I0313 21:02:01.796388 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" event={"ID":"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0","Type":"ContainerStarted","Data":"5a26ab7c7362ccce69b1823ca5d7bdfbecdeeb4170dee6dc1ab0d459ae997f3b"} Mar 13 21:02:02 crc kubenswrapper[4790]: I0313 21:02:02.806040 4790 generic.go:334] "Generic (PLEG): container finished" podID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerID="c9fc9237e156eb0becb6b2dc2279bf5dc16eec046e67e33454f05890e75163e2" exitCode=0 Mar 13 21:02:02 crc kubenswrapper[4790]: I0313 21:02:02.806107 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" event={"ID":"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0","Type":"ContainerDied","Data":"c9fc9237e156eb0becb6b2dc2279bf5dc16eec046e67e33454f05890e75163e2"} Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.142972 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.236920 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") pod \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\" (UID: \"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0\") " Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.246794 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h" (OuterVolumeSpecName: "kube-api-access-7mm5h") pod "5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" (UID: "5b5219d2-3afd-4a8d-ab26-3102b6dee3b0"). InnerVolumeSpecName "kube-api-access-7mm5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.339618 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mm5h\" (UniqueName: \"kubernetes.io/projected/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0-kube-api-access-7mm5h\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.824544 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" event={"ID":"5b5219d2-3afd-4a8d-ab26-3102b6dee3b0","Type":"ContainerDied","Data":"5a26ab7c7362ccce69b1823ca5d7bdfbecdeeb4170dee6dc1ab0d459ae997f3b"} Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.824863 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a26ab7c7362ccce69b1823ca5d7bdfbecdeeb4170dee6dc1ab0d459ae997f3b" Mar 13 21:02:04 crc kubenswrapper[4790]: I0313 21:02:04.824649 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557262-wdkmw" Mar 13 21:02:05 crc kubenswrapper[4790]: I0313 21:02:05.211945 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 21:02:05 crc kubenswrapper[4790]: I0313 21:02:05.219805 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557256-v26h5"] Mar 13 21:02:05 crc kubenswrapper[4790]: I0313 21:02:05.674563 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ffc58ad-c12d-4165-bc92-1e948aa14c42" path="/var/lib/kubelet/pods/4ffc58ad-c12d-4165-bc92-1e948aa14c42/volumes" Mar 13 21:02:14 crc kubenswrapper[4790]: I0313 21:02:14.015479 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:02:14 crc kubenswrapper[4790]: I0313 21:02:14.015963 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:02:44 crc kubenswrapper[4790]: I0313 21:02:44.015487 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:02:44 crc kubenswrapper[4790]: I0313 21:02:44.016463 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:02:48 crc kubenswrapper[4790]: I0313 21:02:48.222930 4790 generic.go:334] "Generic (PLEG): container finished" podID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerID="f70c8e8a59a2069c4d31e1d743181c1e04d84701a48632b52e0eea3913ba3e41" exitCode=0 Mar 13 21:02:48 crc kubenswrapper[4790]: I0313 21:02:48.223020 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerDied","Data":"f70c8e8a59a2069c4d31e1d743181c1e04d84701a48632b52e0eea3913ba3e41"} Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.647977 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.728737 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.728808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.728852 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.729069 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.729108 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") pod \"fe27e2d5-7108-4d49-99bb-15208f36cff7\" (UID: \"fe27e2d5-7108-4d49-99bb-15208f36cff7\") " Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.735777 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.737792 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh" (OuterVolumeSpecName: "kube-api-access-57bxh") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "kube-api-access-57bxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.757542 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "ovncontroller-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.765818 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.774093 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory" (OuterVolumeSpecName: "inventory") pod "fe27e2d5-7108-4d49-99bb-15208f36cff7" (UID: "fe27e2d5-7108-4d49-99bb-15208f36cff7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832421 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57bxh\" (UniqueName: \"kubernetes.io/projected/fe27e2d5-7108-4d49-99bb-15208f36cff7-kube-api-access-57bxh\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832501 4790 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832515 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832531 4790 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/fe27e2d5-7108-4d49-99bb-15208f36cff7-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:49 crc kubenswrapper[4790]: I0313 21:02:49.832544 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fe27e2d5-7108-4d49-99bb-15208f36cff7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.245733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" event={"ID":"fe27e2d5-7108-4d49-99bb-15208f36cff7","Type":"ContainerDied","Data":"e338edf26926f4ba8724c786baa1f3ad6cf40efaff64a89683a9459869e28ac9"} Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.246065 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e338edf26926f4ba8724c786baa1f3ad6cf40efaff64a89683a9459869e28ac9" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.245777 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-7d9cq" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.443499 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4"] Mar 13 21:02:50 crc kubenswrapper[4790]: E0313 21:02:50.443856 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.443872 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: E0313 21:02:50.443887 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerName="oc" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.443894 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerName="oc" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.444084 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" containerName="oc" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.444101 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe27e2d5-7108-4d49-99bb-15208f36cff7" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.444759 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.447715 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.447738 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.447744 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.448312 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.449609 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.456635 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4"] Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.512693 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546350 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc 
kubenswrapper[4790]: I0313 21:02:50.546467 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546491 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546525 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546554 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.546574 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648261 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648496 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648590 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648622 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648671 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.648713 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.652818 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.653003 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.655510 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.658310 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.659346 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.666578 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:50 crc kubenswrapper[4790]: I0313 21:02:50.823700 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:02:51 crc kubenswrapper[4790]: I0313 21:02:51.367594 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4"] Mar 13 21:02:52 crc kubenswrapper[4790]: I0313 21:02:52.266138 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerStarted","Data":"e12be54f325e5426cb892a229deff495e48abdb2e1c4e0dad6fe222c62342b3b"} Mar 13 21:02:52 crc kubenswrapper[4790]: I0313 21:02:52.266199 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerStarted","Data":"0dafa60c26fe92544b98d93c0ab98afa5193df785bac54fff3b546f1a811fd97"} Mar 13 21:02:52 crc kubenswrapper[4790]: I0313 21:02:52.314435 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" podStartSLOduration=1.857272823 podStartE2EDuration="2.314410208s" podCreationTimestamp="2026-03-13 21:02:50 +0000 UTC" firstStartedPulling="2026-03-13 21:02:51.374798044 +0000 UTC m=+2102.395913935" lastFinishedPulling="2026-03-13 21:02:51.831935429 +0000 UTC m=+2102.853051320" observedRunningTime="2026-03-13 21:02:52.307414376 +0000 UTC m=+2103.328530277" watchObservedRunningTime="2026-03-13 21:02:52.314410208 +0000 UTC m=+2103.335526099" Mar 13 21:02:56 crc kubenswrapper[4790]: I0313 21:02:56.828507 4790 scope.go:117] "RemoveContainer" containerID="089c34632a3aa85bf67d8f16facd625e77441bd26bee098a9592424a45b9e093" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.015959 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.016581 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.016650 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.017513 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:03:14 crc kubenswrapper[4790]: I0313 21:03:14.017574 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4" gracePeriod=600 Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.016944 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4" exitCode=0 Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.017009 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4"} Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.018665 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5"} Mar 13 21:03:15 crc kubenswrapper[4790]: I0313 21:03:15.018719 4790 scope.go:117] "RemoveContainer" containerID="ba26f3b945b59e45f7222a641868a052291abcb4cb646f3f210879bd2861783e" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.208543 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.211083 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.236646 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.339245 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.339559 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.339824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442012 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442124 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442236 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442516 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.442655 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.476193 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"redhat-operators-2g4x4\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:17 crc kubenswrapper[4790]: I0313 21:03:17.587706 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:18 crc kubenswrapper[4790]: I0313 21:03:18.076401 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:19 crc kubenswrapper[4790]: I0313 21:03:19.053067 4790 generic.go:334] "Generic (PLEG): container finished" podID="22852680-9cbb-4ca0-817d-d8391e019c99" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" exitCode=0 Mar 13 21:03:19 crc kubenswrapper[4790]: I0313 21:03:19.053268 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84"} Mar 13 21:03:19 crc kubenswrapper[4790]: I0313 21:03:19.053610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerStarted","Data":"ab4c725566fad1ec1f3ea2b58848dc912abb41b5c8e1dbefa92e0bd281a81a63"} Mar 13 21:03:20 crc kubenswrapper[4790]: I0313 21:03:20.063187 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerStarted","Data":"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f"} Mar 13 21:03:21 crc kubenswrapper[4790]: I0313 21:03:21.074165 4790 generic.go:334] "Generic (PLEG): container finished" podID="22852680-9cbb-4ca0-817d-d8391e019c99" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" exitCode=0 Mar 13 21:03:21 crc kubenswrapper[4790]: I0313 21:03:21.074478 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f"} Mar 13 21:03:23 crc kubenswrapper[4790]: I0313 21:03:23.094691 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerStarted","Data":"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56"} Mar 13 21:03:23 crc kubenswrapper[4790]: I0313 21:03:23.120112 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2g4x4" podStartSLOduration=3.328190068 podStartE2EDuration="6.12008583s" podCreationTimestamp="2026-03-13 21:03:17 +0000 UTC" firstStartedPulling="2026-03-13 21:03:19.054964755 +0000 UTC m=+2130.076080646" lastFinishedPulling="2026-03-13 21:03:21.846860507 +0000 UTC m=+2132.867976408" observedRunningTime="2026-03-13 21:03:23.11019667 +0000 UTC m=+2134.131312561" watchObservedRunningTime="2026-03-13 21:03:23.12008583 +0000 UTC m=+2134.141201721" Mar 13 21:03:27 crc kubenswrapper[4790]: I0313 21:03:27.588746 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 
13 21:03:27 crc kubenswrapper[4790]: I0313 21:03:27.589095 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:28 crc kubenswrapper[4790]: I0313 21:03:28.646699 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-2g4x4" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" probeResult="failure" output=< Mar 13 21:03:28 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:03:28 crc kubenswrapper[4790]: > Mar 13 21:03:37 crc kubenswrapper[4790]: I0313 21:03:37.650402 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:37 crc kubenswrapper[4790]: I0313 21:03:37.698665 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:37 crc kubenswrapper[4790]: I0313 21:03:37.891014 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.247744 4790 generic.go:334] "Generic (PLEG): container finished" podID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerID="e12be54f325e5426cb892a229deff495e48abdb2e1c4e0dad6fe222c62342b3b" exitCode=0 Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.247816 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerDied","Data":"e12be54f325e5426cb892a229deff495e48abdb2e1c4e0dad6fe222c62342b3b"} Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.248371 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2g4x4" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" containerID="cri-o://5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" gracePeriod=2 Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.715091 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.843916 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") pod \"22852680-9cbb-4ca0-817d-d8391e019c99\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.844168 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") pod \"22852680-9cbb-4ca0-817d-d8391e019c99\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.844861 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") pod \"22852680-9cbb-4ca0-817d-d8391e019c99\" (UID: \"22852680-9cbb-4ca0-817d-d8391e019c99\") " Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.845918 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities" (OuterVolumeSpecName: "utilities") pod "22852680-9cbb-4ca0-817d-d8391e019c99" (UID: "22852680-9cbb-4ca0-817d-d8391e019c99"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.847340 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.854854 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb" (OuterVolumeSpecName: "kube-api-access-s9qkb") pod "22852680-9cbb-4ca0-817d-d8391e019c99" (UID: "22852680-9cbb-4ca0-817d-d8391e019c99"). InnerVolumeSpecName "kube-api-access-s9qkb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.949978 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9qkb\" (UniqueName: \"kubernetes.io/projected/22852680-9cbb-4ca0-817d-d8391e019c99-kube-api-access-s9qkb\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:39 crc kubenswrapper[4790]: I0313 21:03:39.976780 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22852680-9cbb-4ca0-817d-d8391e019c99" (UID: "22852680-9cbb-4ca0-817d-d8391e019c99"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.051587 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22852680-9cbb-4ca0-817d-d8391e019c99-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262297 4790 generic.go:334] "Generic (PLEG): container finished" podID="22852680-9cbb-4ca0-817d-d8391e019c99" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" exitCode=0 Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262357 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2g4x4" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262400 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56"} Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262704 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2g4x4" event={"ID":"22852680-9cbb-4ca0-817d-d8391e019c99","Type":"ContainerDied","Data":"ab4c725566fad1ec1f3ea2b58848dc912abb41b5c8e1dbefa92e0bd281a81a63"} Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.262734 4790 scope.go:117] "RemoveContainer" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.294338 4790 scope.go:117] "RemoveContainer" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.307982 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.318876 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2g4x4"] Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.331426 4790 scope.go:117] "RemoveContainer" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.369526 4790 scope.go:117] "RemoveContainer" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" Mar 13 21:03:40 crc kubenswrapper[4790]: E0313 21:03:40.370286 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56\": container with ID starting with 5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56 not found: ID does not exist" containerID="5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.370346 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56"} err="failed to get container status \"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56\": rpc error: code = NotFound desc = could not find container \"5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56\": container with ID starting with 5f5db77a4a9333641a43c61580d7bd819737c1c6af2e72bc28a86fb63809ba56 not found: ID does not exist" Mar 13 21:03:40 crc 
kubenswrapper[4790]: I0313 21:03:40.370390 4790 scope.go:117] "RemoveContainer" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" Mar 13 21:03:40 crc kubenswrapper[4790]: E0313 21:03:40.371507 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f\": container with ID starting with abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f not found: ID does not exist" containerID="abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.371533 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f"} err="failed to get container status \"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f\": rpc error: code = NotFound desc = could not find container \"abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f\": container with ID starting with abab4a6bd1c0b2d5fb3ce10efef2b9ca332e0bb9e702062917ee2950e907d90f not found: ID does not exist" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.371551 4790 scope.go:117] "RemoveContainer" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" Mar 13 21:03:40 crc kubenswrapper[4790]: E0313 21:03:40.372056 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84\": container with ID starting with 3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84 not found: ID does not exist" containerID="3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.372082 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84"} err="failed to get container status \"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84\": rpc error: code = NotFound desc = could not find container \"3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84\": container with ID starting with 3f653f15de75632b9c4bc0a14dc99e71623fdee0ef1c06898871ea922e049c84 not found: ID does not exist" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.646741 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765398 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765504 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765572 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765641 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765690 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.765723 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") pod \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\" (UID: \"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0\") " Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.770145 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr" (OuterVolumeSpecName: "kube-api-access-2x9nr") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "kube-api-access-2x9nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.771198 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.798848 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory" (OuterVolumeSpecName: "inventory") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.801723 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.808536 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.809694 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" (UID: "944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.869937 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.869976 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.869990 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.870004 4790 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.870017 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2x9nr\" (UniqueName: \"kubernetes.io/projected/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-kube-api-access-2x9nr\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:40 crc kubenswrapper[4790]: I0313 21:03:40.870030 4790 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.272106 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" event={"ID":"944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0","Type":"ContainerDied","Data":"0dafa60c26fe92544b98d93c0ab98afa5193df785bac54fff3b546f1a811fd97"} Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.273050 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dafa60c26fe92544b98d93c0ab98afa5193df785bac54fff3b546f1a811fd97" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.272119 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356230 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx"] Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356730 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-utilities" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356751 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-utilities" Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356784 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-content" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356794 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="extract-content" Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356817 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356826 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" Mar 13 21:03:41 crc kubenswrapper[4790]: E0313 21:03:41.356841 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.356850 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.357079 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.357106 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" containerName="registry-server" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.357909 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.359817 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.359956 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.363624 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.363628 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.364085 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.365753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx"] Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479552 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479627 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479658 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479686 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.479751 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.580858 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.580973 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.581025 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.581056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.581091 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.584814 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.585273 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.586329 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.590972 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.597534 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.676474 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22852680-9cbb-4ca0-817d-d8391e019c99" path="/var/lib/kubelet/pods/22852680-9cbb-4ca0-817d-d8391e019c99/volumes" Mar 13 21:03:41 crc kubenswrapper[4790]: I0313 21:03:41.681827 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:03:42 crc kubenswrapper[4790]: I0313 21:03:42.188185 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx"] Mar 13 21:03:42 crc kubenswrapper[4790]: I0313 21:03:42.283311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerStarted","Data":"750819b8447adbdcf460745d4cc408a88dcc52443bc7524ebb6bbcda342e2ca3"} Mar 13 21:03:43 crc kubenswrapper[4790]: I0313 21:03:43.292863 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerStarted","Data":"2a4b7cacb6bb56397aa8e80bce91be52e687845b109945911184f15eb741cc40"} Mar 13 21:03:43 crc kubenswrapper[4790]: I0313 21:03:43.306790 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" podStartSLOduration=1.8644482 podStartE2EDuration="2.306766418s" podCreationTimestamp="2026-03-13 21:03:41 +0000 UTC" firstStartedPulling="2026-03-13 21:03:42.179675029 +0000 UTC m=+2153.200790920" lastFinishedPulling="2026-03-13 21:03:42.621993247 +0000 UTC m=+2153.643109138" observedRunningTime="2026-03-13 21:03:43.305620097 +0000 UTC m=+2154.326735988" watchObservedRunningTime="2026-03-13 21:03:43.306766418 +0000 UTC m=+2154.327882309" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.151661 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.153914 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.157308 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.157360 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.157694 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.160361 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.305019 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"auto-csr-approver-29557264-b5j85\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.406878 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"auto-csr-approver-29557264-b5j85\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.430990 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"auto-csr-approver-29557264-b5j85\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.499066 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:00 crc kubenswrapper[4790]: I0313 21:04:00.919041 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:04:00 crc kubenswrapper[4790]: W0313 21:04:00.921658 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33811d20_0fb8_4b06_a9dd_d2488b19d7b9.slice/crio-5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06 WatchSource:0}: Error finding container 5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06: Status 404 returned error can't find the container with id 5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06 Mar 13 21:04:01 crc kubenswrapper[4790]: I0313 21:04:01.454358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557264-b5j85" event={"ID":"33811d20-0fb8-4b06-a9dd-d2488b19d7b9","Type":"ContainerStarted","Data":"5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06"} Mar 13 21:04:02 crc kubenswrapper[4790]: I0313 21:04:02.465321 4790 generic.go:334] "Generic (PLEG): container finished" podID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerID="3a443bd9f4b8d1df7af93baf309b6b85a45139407ed6e8e7a9df32fd174d2a54" exitCode=0 Mar 13 21:04:02 crc kubenswrapper[4790]: I0313 21:04:02.465411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557264-b5j85" event={"ID":"33811d20-0fb8-4b06-a9dd-d2488b19d7b9","Type":"ContainerDied","Data":"3a443bd9f4b8d1df7af93baf309b6b85a45139407ed6e8e7a9df32fd174d2a54"} Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.761991 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.778716 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") pod \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\" (UID: \"33811d20-0fb8-4b06-a9dd-d2488b19d7b9\") " Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.783993 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5" (OuterVolumeSpecName: "kube-api-access-69tg5") pod "33811d20-0fb8-4b06-a9dd-d2488b19d7b9" (UID: "33811d20-0fb8-4b06-a9dd-d2488b19d7b9"). InnerVolumeSpecName "kube-api-access-69tg5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:04:03 crc kubenswrapper[4790]: I0313 21:04:03.881365 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69tg5\" (UniqueName: \"kubernetes.io/projected/33811d20-0fb8-4b06-a9dd-d2488b19d7b9-kube-api-access-69tg5\") on node \"crc\" DevicePath \"\"" Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.488579 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557264-b5j85" event={"ID":"33811d20-0fb8-4b06-a9dd-d2488b19d7b9","Type":"ContainerDied","Data":"5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06"} Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.488624 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbe699a2d0f4ccd103ed5fc014ec0d87203e77ab981483e5c6e0a7700061b06" Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.488654 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557264-b5j85" Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.827650 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 21:04:04 crc kubenswrapper[4790]: I0313 21:04:04.837331 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557258-gqmrr"] Mar 13 21:04:05 crc kubenswrapper[4790]: I0313 21:04:05.672390 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7706813b-e8e7-4b17-ba18-993c121eed66" path="/var/lib/kubelet/pods/7706813b-e8e7-4b17-ba18-993c121eed66/volumes" Mar 13 21:04:56 crc kubenswrapper[4790]: I0313 21:04:56.942319 4790 scope.go:117] "RemoveContainer" containerID="98d6a341587e40eeb366a4b8a2eab51c3ea58fa67b5db767f9e2261febd34d64" Mar 13 21:05:02 crc kubenswrapper[4790]: I0313 21:05:02.997837 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:02 crc kubenswrapper[4790]: E0313 21:05:02.998903 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerName="oc" Mar 13 21:05:02 crc kubenswrapper[4790]: I0313 21:05:02.998921 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerName="oc" Mar 13 21:05:02 crc kubenswrapper[4790]: I0313 21:05:02.999149 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" containerName="oc" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.001033 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.009345 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.170176 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.170632 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.170687 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272056 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272099 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272204 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272911 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.272906 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.294585 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"redhat-marketplace-88dvw\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.326467 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:03 crc kubenswrapper[4790]: I0313 21:05:03.787876 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.055805 4790 generic.go:334] "Generic (PLEG): container finished" podID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" exitCode=0 Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.055865 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1"} Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.055910 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerStarted","Data":"f00d7030b9f77eb48a44e61af48736a85e89f6cd7470963d43883183ef36faa7"} Mar 13 21:05:04 crc kubenswrapper[4790]: I0313 21:05:04.057419 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:05:05 crc kubenswrapper[4790]: I0313 21:05:05.067309 4790 generic.go:334] "Generic (PLEG): container finished" podID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" exitCode=0 Mar 13 21:05:05 crc kubenswrapper[4790]: I0313 21:05:05.067391 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad"} Mar 13 21:05:06 crc kubenswrapper[4790]: I0313 21:05:06.080272 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerStarted","Data":"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35"} Mar 13 21:05:06 crc kubenswrapper[4790]: I0313 21:05:06.101480 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-88dvw" podStartSLOduration=2.683805465 podStartE2EDuration="4.10146327s" podCreationTimestamp="2026-03-13 21:05:02 +0000 UTC" firstStartedPulling="2026-03-13 21:05:04.057181521 +0000 UTC m=+2235.078297412" lastFinishedPulling="2026-03-13 21:05:05.474839326 +0000 UTC m=+2236.495955217" observedRunningTime="2026-03-13 21:05:06.096891085 +0000 UTC m=+2237.118006986" watchObservedRunningTime="2026-03-13 21:05:06.10146327 +0000 UTC m=+2237.122579161" Mar 13 21:05:13 crc kubenswrapper[4790]: I0313 21:05:13.326729 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:13 crc kubenswrapper[4790]: I0313 21:05:13.327288 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:13 crc kubenswrapper[4790]: I0313 21:05:13.381438 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.015338 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.015413 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.213021 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:14 crc kubenswrapper[4790]: I0313 21:05:14.273562 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.192610 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-88dvw" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" containerID="cri-o://10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" gracePeriod=2 Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.646766 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821207 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") pod \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821365 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") pod \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821565 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") pod \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\" (UID: \"de983e6c-4ce2-42f6-94ed-44a141b2b39d\") " Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.821903 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities" (OuterVolumeSpecName: "utilities") pod "de983e6c-4ce2-42f6-94ed-44a141b2b39d" (UID: "de983e6c-4ce2-42f6-94ed-44a141b2b39d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.822128 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.827458 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7" (OuterVolumeSpecName: "kube-api-access-mpzc7") pod "de983e6c-4ce2-42f6-94ed-44a141b2b39d" (UID: "de983e6c-4ce2-42f6-94ed-44a141b2b39d"). InnerVolumeSpecName "kube-api-access-mpzc7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.845649 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de983e6c-4ce2-42f6-94ed-44a141b2b39d" (UID: "de983e6c-4ce2-42f6-94ed-44a141b2b39d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.924281 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mpzc7\" (UniqueName: \"kubernetes.io/projected/de983e6c-4ce2-42f6-94ed-44a141b2b39d-kube-api-access-mpzc7\") on node \"crc\" DevicePath \"\"" Mar 13 21:05:16 crc kubenswrapper[4790]: I0313 21:05:16.924319 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de983e6c-4ce2-42f6-94ed-44a141b2b39d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204180 4790 generic.go:334] "Generic (PLEG): container finished" podID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" exitCode=0 Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204251 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35"} Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204315 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-88dvw" event={"ID":"de983e6c-4ce2-42f6-94ed-44a141b2b39d","Type":"ContainerDied","Data":"f00d7030b9f77eb48a44e61af48736a85e89f6cd7470963d43883183ef36faa7"} Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204318 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-88dvw" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.204353 4790 scope.go:117] "RemoveContainer" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.244791 4790 scope.go:117] "RemoveContainer" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.251930 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.261409 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-88dvw"] Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.278675 4790 scope.go:117] "RemoveContainer" containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.315935 4790 scope.go:117] "RemoveContainer" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" Mar 13 21:05:17 crc kubenswrapper[4790]: E0313 21:05:17.316650 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35\": container with ID starting with 10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35 not found: ID does not exist" containerID="10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.316684 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35"} err="failed to get container status \"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35\": rpc error: code = NotFound desc = could not find container \"10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35\": container with ID starting with 10a14ddf25e93f79f621c13a15193a02a7cf706fe43474cf20f585d46862cc35 not found: ID does not exist" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.316709 4790 scope.go:117] "RemoveContainer" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" Mar 13 21:05:17 crc kubenswrapper[4790]: E0313 21:05:17.316965 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad\": container with ID starting with 89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad not found: ID does not exist" containerID="89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.316995 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad"} err="failed to get container status \"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad\": rpc error: code = NotFound desc = could not find container \"89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad\": container with ID starting with 89d396ea2099b01727d049fa95d312446f917d6fc53c54aa76e4c03f934f2cad not found: ID does not exist" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.317010 4790 scope.go:117] "RemoveContainer" 
containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" Mar 13 21:05:17 crc kubenswrapper[4790]: E0313 21:05:17.317436 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1\": container with ID starting with 538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1 not found: ID does not exist" containerID="538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.317464 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1"} err="failed to get container status \"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1\": rpc error: code = NotFound desc = could not find container \"538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1\": container with ID starting with 538f094cbd74486037f753c3611730ee28578cdb59d56d5be516e368c14126b1 not found: ID does not exist" Mar 13 21:05:17 crc kubenswrapper[4790]: I0313 21:05:17.670164 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" path="/var/lib/kubelet/pods/de983e6c-4ce2-42f6-94ed-44a141b2b39d/volumes" Mar 13 21:05:44 crc kubenswrapper[4790]: I0313 21:05:44.016418 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:05:44 crc kubenswrapper[4790]: I0313 21:05:44.018576 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.148582 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:06:00 crc kubenswrapper[4790]: E0313 21:06:00.149439 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-utilities" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149450 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-utilities" Mar 13 21:06:00 crc kubenswrapper[4790]: E0313 21:06:00.149464 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-content" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149470 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="extract-content" Mar 13 21:06:00 crc kubenswrapper[4790]: E0313 21:06:00.149493 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.149499 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 
21:06:00.149731 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="de983e6c-4ce2-42f6-94ed-44a141b2b39d" containerName="registry-server" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.150364 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.153037 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.153172 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.153275 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.158693 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.336804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"auto-csr-approver-29557266-4sbr6\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.438614 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"auto-csr-approver-29557266-4sbr6\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.462336 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"auto-csr-approver-29557266-4sbr6\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.508752 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:00 crc kubenswrapper[4790]: I0313 21:06:00.939093 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:06:01 crc kubenswrapper[4790]: I0313 21:06:01.646448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" event={"ID":"6a921d70-847d-4a96-ad9a-18438299237e","Type":"ContainerStarted","Data":"859b08a740a73b3b98ee9d8f5c5d1673cf407fb993f939365f89333530343cb2"} Mar 13 21:06:02 crc kubenswrapper[4790]: I0313 21:06:02.657634 4790 generic.go:334] "Generic (PLEG): container finished" podID="6a921d70-847d-4a96-ad9a-18438299237e" containerID="3e0c0f63bb37da5c2b233a3b4a5d7ae121b4ed58aa4773dd2ed0d98e00fff307" exitCode=0 Mar 13 21:06:02 crc kubenswrapper[4790]: I0313 21:06:02.657680 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" event={"ID":"6a921d70-847d-4a96-ad9a-18438299237e","Type":"ContainerDied","Data":"3e0c0f63bb37da5c2b233a3b4a5d7ae121b4ed58aa4773dd2ed0d98e00fff307"} Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.001836 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.039633 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") pod \"6a921d70-847d-4a96-ad9a-18438299237e\" (UID: \"6a921d70-847d-4a96-ad9a-18438299237e\") " Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.045822 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql" (OuterVolumeSpecName: "kube-api-access-dgtql") pod "6a921d70-847d-4a96-ad9a-18438299237e" (UID: "6a921d70-847d-4a96-ad9a-18438299237e"). InnerVolumeSpecName "kube-api-access-dgtql". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.143080 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dgtql\" (UniqueName: \"kubernetes.io/projected/6a921d70-847d-4a96-ad9a-18438299237e-kube-api-access-dgtql\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.679419 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" event={"ID":"6a921d70-847d-4a96-ad9a-18438299237e","Type":"ContainerDied","Data":"859b08a740a73b3b98ee9d8f5c5d1673cf407fb993f939365f89333530343cb2"} Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.679461 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="859b08a740a73b3b98ee9d8f5c5d1673cf407fb993f939365f89333530343cb2" Mar 13 21:06:04 crc kubenswrapper[4790]: I0313 21:06:04.679525 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557266-4sbr6" Mar 13 21:06:05 crc kubenswrapper[4790]: I0313 21:06:05.074829 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:06:05 crc kubenswrapper[4790]: I0313 21:06:05.098692 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557260-m6wtk"] Mar 13 21:06:05 crc kubenswrapper[4790]: I0313 21:06:05.670814 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0" path="/var/lib/kubelet/pods/6b6fa0dd-2e8f-4ecd-a381-0bfc4a1e20b0/volumes" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.729674 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:08 crc kubenswrapper[4790]: E0313 21:06:08.730329 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a921d70-847d-4a96-ad9a-18438299237e" containerName="oc" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.730342 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a921d70-847d-4a96-ad9a-18438299237e" containerName="oc" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.730548 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a921d70-847d-4a96-ad9a-18438299237e" containerName="oc" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.731978 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.746790 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.831818 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.832279 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.832437 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.934957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935074 4790 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935528 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.935623 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:08 crc kubenswrapper[4790]: I0313 21:06:08.967660 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"certified-operators-q7hwz\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:09 crc kubenswrapper[4790]: I0313 21:06:09.053965 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:09 crc kubenswrapper[4790]: I0313 21:06:09.605917 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:09 crc kubenswrapper[4790]: I0313 21:06:09.744781 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerStarted","Data":"b899db2cbc3a23de289484d34e35002cd8a84c89f220f29fe04a8a0a11619bb5"} Mar 13 21:06:10 crc kubenswrapper[4790]: I0313 21:06:10.755279 4790 generic.go:334] "Generic (PLEG): container finished" podID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" exitCode=0 Mar 13 21:06:10 crc kubenswrapper[4790]: I0313 21:06:10.755330 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0"} Mar 13 21:06:11 crc kubenswrapper[4790]: I0313 21:06:11.767767 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerStarted","Data":"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a"} Mar 13 21:06:12 crc kubenswrapper[4790]: I0313 21:06:12.779833 4790 generic.go:334] "Generic (PLEG): container finished" podID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" exitCode=0 Mar 13 21:06:12 crc kubenswrapper[4790]: I0313 21:06:12.779924 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a"} Mar 13 21:06:13 crc kubenswrapper[4790]: I0313 21:06:13.792835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerStarted","Data":"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6"} Mar 13 21:06:13 crc kubenswrapper[4790]: I0313 21:06:13.823502 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7hwz" podStartSLOduration=3.375850526 podStartE2EDuration="5.823485297s" podCreationTimestamp="2026-03-13 21:06:08 +0000 UTC" firstStartedPulling="2026-03-13 21:06:10.756951744 +0000 UTC m=+2301.778067635" lastFinishedPulling="2026-03-13 21:06:13.204586515 +0000 UTC m=+2304.225702406" observedRunningTime="2026-03-13 21:06:13.81015065 +0000 UTC m=+2304.831266541" watchObservedRunningTime="2026-03-13 21:06:13.823485297 +0000 UTC m=+2304.844601188" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.016076 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.016167 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.016232 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.017368 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.017511 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" gracePeriod=600 Mar 13 21:06:14 crc kubenswrapper[4790]: E0313 21:06:14.153100 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.805470 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" exitCode=0 Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.805520 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5"} Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.805869 4790 scope.go:117] "RemoveContainer" containerID="a9a94b980a92050256811681ca21f1352e966795dd8d0d5b7f29e267e6b5c0a4" Mar 13 21:06:14 crc kubenswrapper[4790]: I0313 21:06:14.806548 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:14 crc kubenswrapper[4790]: E0313 21:06:14.806802 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.055111 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.056712 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7hwz" 
Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.099776 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.890104 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:19 crc kubenswrapper[4790]: I0313 21:06:19.943015 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:21 crc kubenswrapper[4790]: I0313 21:06:21.869977 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7hwz" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" containerID="cri-o://5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" gracePeriod=2 Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.335029 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.490248 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") pod \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.490356 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") pod \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.490509 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") pod \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\" (UID: \"07a0994d-f139-49d8-8d95-2b6ca52a0b84\") " Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.491321 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities" (OuterVolumeSpecName: "utilities") pod "07a0994d-f139-49d8-8d95-2b6ca52a0b84" (UID: "07a0994d-f139-49d8-8d95-2b6ca52a0b84"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.496815 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd" (OuterVolumeSpecName: "kube-api-access-xwqpd") pod "07a0994d-f139-49d8-8d95-2b6ca52a0b84" (UID: "07a0994d-f139-49d8-8d95-2b6ca52a0b84"). InnerVolumeSpecName "kube-api-access-xwqpd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.592652 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.592915 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwqpd\" (UniqueName: \"kubernetes.io/projected/07a0994d-f139-49d8-8d95-2b6ca52a0b84-kube-api-access-xwqpd\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.880085 4790 generic.go:334] "Generic (PLEG): container finished" podID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" exitCode=0 Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.880158 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7hwz" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.881360 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6"} Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.881629 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7hwz" event={"ID":"07a0994d-f139-49d8-8d95-2b6ca52a0b84","Type":"ContainerDied","Data":"b899db2cbc3a23de289484d34e35002cd8a84c89f220f29fe04a8a0a11619bb5"} Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.881690 4790 scope.go:117] "RemoveContainer" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.900599 4790 scope.go:117] "RemoveContainer" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.920627 4790 scope.go:117] "RemoveContainer" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.961201 4790 scope.go:117] "RemoveContainer" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" Mar 13 21:06:22 crc kubenswrapper[4790]: E0313 21:06:22.961678 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6\": container with ID starting with 5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6 not found: ID does not exist" containerID="5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.961710 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6"} err="failed to get container status \"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6\": rpc error: code = NotFound desc = could not find container \"5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6\": container with ID starting with 5ce286ca554ed0c42303ab4f7f71de06b32a670bebdcbadb4bddf31c9cb470f6 not found: ID does not exist" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.961730 4790 scope.go:117] 
"RemoveContainer" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" Mar 13 21:06:22 crc kubenswrapper[4790]: E0313 21:06:22.962077 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a\": container with ID starting with a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a not found: ID does not exist" containerID="a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.962101 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a"} err="failed to get container status \"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a\": rpc error: code = NotFound desc = could not find container \"a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a\": container with ID starting with a1a099fa68b4c873fb295134ba0d37718cdf37dd4a9b1c3af079a3c182741f0a not found: ID does not exist" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.962116 4790 scope.go:117] "RemoveContainer" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" Mar 13 21:06:22 crc kubenswrapper[4790]: E0313 21:06:22.962545 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0\": container with ID starting with 6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0 not found: ID does not exist" containerID="6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.962578 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0"} err="failed to get container status \"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0\": rpc error: code = NotFound desc = could not find container \"6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0\": container with ID starting with 6439279eb663ad1eb5f422402402c63aaac0d91e617550278ad0ce17daea07a0 not found: ID does not exist" Mar 13 21:06:22 crc kubenswrapper[4790]: I0313 21:06:22.964734 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07a0994d-f139-49d8-8d95-2b6ca52a0b84" (UID: "07a0994d-f139-49d8-8d95-2b6ca52a0b84"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.000311 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07a0994d-f139-49d8-8d95-2b6ca52a0b84-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.235634 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.243895 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q7hwz"] Mar 13 21:06:23 crc kubenswrapper[4790]: I0313 21:06:23.671108 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" path="/var/lib/kubelet/pods/07a0994d-f139-49d8-8d95-2b6ca52a0b84/volumes" Mar 13 21:06:25 crc kubenswrapper[4790]: I0313 21:06:25.660782 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:25 crc kubenswrapper[4790]: E0313 21:06:25.661631 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:38 crc kubenswrapper[4790]: I0313 21:06:38.659908 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:38 crc kubenswrapper[4790]: E0313 21:06:38.660582 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:50 crc kubenswrapper[4790]: I0313 21:06:50.660579 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:06:50 crc kubenswrapper[4790]: E0313 21:06:50.661304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:06:57 crc kubenswrapper[4790]: I0313 21:06:57.057634 4790 scope.go:117] "RemoveContainer" containerID="5bdb5fc52fc30f3ba02b7731679748560a6cefd0cfb24c581e1cc818e8a93cb1" Mar 13 21:07:04 crc kubenswrapper[4790]: I0313 21:07:04.660451 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:04 crc kubenswrapper[4790]: E0313 21:07:04.661370 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:16 crc kubenswrapper[4790]: I0313 21:07:16.660163 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:16 crc kubenswrapper[4790]: E0313 21:07:16.661059 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:24 crc kubenswrapper[4790]: I0313 21:07:24.413925 4790 generic.go:334] "Generic (PLEG): container finished" podID="c70cf667-ebdd-414d-be40-62d26209abcf" containerID="2a4b7cacb6bb56397aa8e80bce91be52e687845b109945911184f15eb741cc40" exitCode=0 Mar 13 21:07:24 crc kubenswrapper[4790]: I0313 21:07:24.413990 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerDied","Data":"2a4b7cacb6bb56397aa8e80bce91be52e687845b109945911184f15eb741cc40"} Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.882615 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915318 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915438 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915497 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915558 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.915584 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") pod \"c70cf667-ebdd-414d-be40-62d26209abcf\" (UID: \"c70cf667-ebdd-414d-be40-62d26209abcf\") " Mar 13 
21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.921096 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl" (OuterVolumeSpecName: "kube-api-access-sd6kl") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "kube-api-access-sd6kl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.921652 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.943028 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.943411 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:25 crc kubenswrapper[4790]: I0313 21:07:25.946090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory" (OuterVolumeSpecName: "inventory") pod "c70cf667-ebdd-414d-be40-62d26209abcf" (UID: "c70cf667-ebdd-414d-be40-62d26209abcf"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018190 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018234 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018257 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018269 4790 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c70cf667-ebdd-414d-be40-62d26209abcf-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.018284 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd6kl\" (UniqueName: \"kubernetes.io/projected/c70cf667-ebdd-414d-be40-62d26209abcf-kube-api-access-sd6kl\") on node \"crc\" DevicePath \"\"" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.441122 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" event={"ID":"c70cf667-ebdd-414d-be40-62d26209abcf","Type":"ContainerDied","Data":"750819b8447adbdcf460745d4cc408a88dcc52443bc7524ebb6bbcda342e2ca3"} Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.441169 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750819b8447adbdcf460745d4cc408a88dcc52443bc7524ebb6bbcda342e2ca3" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.441174 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.536011 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv"] Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.538445 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.538821 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.538959 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-utilities" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539042 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-utilities" Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.539118 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c70cf667-ebdd-414d-be40-62d26209abcf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539196 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c70cf667-ebdd-414d-be40-62d26209abcf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: E0313 21:07:26.539276 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-content" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539348 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="extract-content" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539717 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c70cf667-ebdd-414d-be40-62d26209abcf" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.539807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="07a0994d-f139-49d8-8d95-2b6ca52a0b84" containerName="registry-server" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.540699 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.544238 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.544588 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.544841 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545285 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545510 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545811 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.545994 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.547488 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv"] Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629631 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629656 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629787 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629851 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.629986 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630072 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630186 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.630316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732035 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732438 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732458 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732502 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732525 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732558 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732573 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732599 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732615 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732644 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.732689 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.734033 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.736045 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.736812 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737505 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737579 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.737789 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.738930 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.739006 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.739106 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.749738 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"nova-edpm-deployment-openstack-edpm-ipam-lhxzv\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:26 crc kubenswrapper[4790]: I0313 21:07:26.863341 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:07:27 crc kubenswrapper[4790]: I0313 21:07:27.361008 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv"] Mar 13 21:07:27 crc kubenswrapper[4790]: I0313 21:07:27.451912 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerStarted","Data":"5e955eaeb50a09aee8260c883bbe8ca2342e2062895d307915c8671ae2b82195"} Mar 13 21:07:27 crc kubenswrapper[4790]: I0313 21:07:27.659840 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:27 crc kubenswrapper[4790]: E0313 21:07:27.660264 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:28 crc kubenswrapper[4790]: I0313 21:07:28.460048 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerStarted","Data":"e137a0d045ac7d76f4b1c0eb61bbf159c0b0eb2b656ff90d4306bffd4b4bb7f4"} Mar 13 21:07:28 crc kubenswrapper[4790]: I0313 21:07:28.482240 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" podStartSLOduration=2.037676662 podStartE2EDuration="2.482214065s" podCreationTimestamp="2026-03-13 21:07:26 +0000 UTC" firstStartedPulling="2026-03-13 21:07:27.357646321 +0000 UTC m=+2378.378762212" lastFinishedPulling="2026-03-13 21:07:27.802183724 +0000 UTC m=+2378.823299615" observedRunningTime="2026-03-13 21:07:28.47691398 +0000 UTC m=+2379.498029871" watchObservedRunningTime="2026-03-13 21:07:28.482214065 +0000 UTC m=+2379.503329956" Mar 13 21:07:41 crc kubenswrapper[4790]: I0313 21:07:41.660433 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:41 crc kubenswrapper[4790]: E0313 21:07:41.662714 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:07:56 crc kubenswrapper[4790]: I0313 21:07:56.660145 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:07:56 crc kubenswrapper[4790]: E0313 21:07:56.661058 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.145232 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.147096 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.162247 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.165031 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.165864 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.166099 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.224886 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"auto-csr-approver-29557268-fbrfb\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.327504 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"auto-csr-approver-29557268-fbrfb\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.346604 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"auto-csr-approver-29557268-fbrfb\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.471520 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:00 crc kubenswrapper[4790]: I0313 21:08:00.966779 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:08:00 crc kubenswrapper[4790]: W0313 21:08:00.975703 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f962ddd_b18b_43c7_81e1_7eda48d64d88.slice/crio-c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc WatchSource:0}: Error finding container c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc: Status 404 returned error can't find the container with id c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc Mar 13 21:08:01 crc kubenswrapper[4790]: I0313 21:08:01.731298 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" event={"ID":"5f962ddd-b18b-43c7-81e1-7eda48d64d88","Type":"ContainerStarted","Data":"c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc"} Mar 13 21:08:02 crc kubenswrapper[4790]: I0313 21:08:02.742608 4790 generic.go:334] "Generic (PLEG): container finished" podID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerID="bdca2f8da697e12973555a54d7d0753abfb943fd0d2919dd4adb4178a3e9c052" exitCode=0 Mar 13 21:08:02 crc kubenswrapper[4790]: I0313 21:08:02.742673 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" event={"ID":"5f962ddd-b18b-43c7-81e1-7eda48d64d88","Type":"ContainerDied","Data":"bdca2f8da697e12973555a54d7d0753abfb943fd0d2919dd4adb4178a3e9c052"} Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.046838 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.204235 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") pod \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\" (UID: \"5f962ddd-b18b-43c7-81e1-7eda48d64d88\") " Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.211047 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx" (OuterVolumeSpecName: "kube-api-access-qsrpx") pod "5f962ddd-b18b-43c7-81e1-7eda48d64d88" (UID: "5f962ddd-b18b-43c7-81e1-7eda48d64d88"). InnerVolumeSpecName "kube-api-access-qsrpx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.306941 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qsrpx\" (UniqueName: \"kubernetes.io/projected/5f962ddd-b18b-43c7-81e1-7eda48d64d88-kube-api-access-qsrpx\") on node \"crc\" DevicePath \"\"" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.760233 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" event={"ID":"5f962ddd-b18b-43c7-81e1-7eda48d64d88","Type":"ContainerDied","Data":"c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc"} Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.760743 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c60da4f485099ecee890af9941d9bfee6803705f73e9055e04d52a650268e3bc" Mar 13 21:08:04 crc kubenswrapper[4790]: I0313 21:08:04.760294 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557268-fbrfb" Mar 13 21:08:05 crc kubenswrapper[4790]: I0313 21:08:05.117282 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:08:05 crc kubenswrapper[4790]: I0313 21:08:05.128360 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557262-wdkmw"] Mar 13 21:08:05 crc kubenswrapper[4790]: I0313 21:08:05.670889 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b5219d2-3afd-4a8d-ab26-3102b6dee3b0" path="/var/lib/kubelet/pods/5b5219d2-3afd-4a8d-ab26-3102b6dee3b0/volumes" Mar 13 21:08:11 crc kubenswrapper[4790]: I0313 21:08:11.660467 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:11 crc kubenswrapper[4790]: E0313 21:08:11.661371 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:25 crc kubenswrapper[4790]: I0313 21:08:25.660280 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:25 crc kubenswrapper[4790]: E0313 21:08:25.661168 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:38 crc kubenswrapper[4790]: I0313 21:08:38.659575 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:38 crc kubenswrapper[4790]: E0313 21:08:38.660419 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:50 crc kubenswrapper[4790]: I0313 21:08:50.660504 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:08:50 crc kubenswrapper[4790]: E0313 21:08:50.661504 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:08:57 crc kubenswrapper[4790]: I0313 21:08:57.167618 4790 scope.go:117] "RemoveContainer" containerID="c9fc9237e156eb0becb6b2dc2279bf5dc16eec046e67e33454f05890e75163e2" Mar 13 21:09:01 crc kubenswrapper[4790]: I0313 21:09:01.660114 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:01 crc kubenswrapper[4790]: E0313 21:09:01.660942 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:15 crc kubenswrapper[4790]: I0313 21:09:15.660757 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:15 crc kubenswrapper[4790]: E0313 21:09:15.661564 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:28 crc kubenswrapper[4790]: I0313 21:09:28.659693 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:28 crc kubenswrapper[4790]: E0313 21:09:28.661558 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.224511 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:33 crc kubenswrapper[4790]: E0313 21:09:33.228853 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerName="oc" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.228910 4790 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerName="oc" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.229884 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" containerName="oc" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.242067 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.253892 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.337874 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.337945 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.338107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.439679 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.439830 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.439863 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.440201 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.440365 4790 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.467195 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"community-operators-692s5\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:33 crc kubenswrapper[4790]: I0313 21:09:33.578736 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.181973 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.539099 4790 generic.go:334] "Generic (PLEG): container finished" podID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" exitCode=0 Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.540604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579"} Mar 13 21:09:34 crc kubenswrapper[4790]: I0313 21:09:34.540900 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerStarted","Data":"f098f8de58f4430eacc872f8239f13bde3881a4b8d296b404f354b27ab3de96c"} Mar 13 21:09:36 crc kubenswrapper[4790]: I0313 21:09:36.560907 4790 generic.go:334] "Generic (PLEG): container finished" podID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" exitCode=0 Mar 13 21:09:36 crc kubenswrapper[4790]: I0313 21:09:36.560961 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec"} Mar 13 21:09:37 crc kubenswrapper[4790]: I0313 21:09:37.571596 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerStarted","Data":"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f"} Mar 13 21:09:37 crc kubenswrapper[4790]: I0313 21:09:37.599014 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-692s5" podStartSLOduration=2.023030629 podStartE2EDuration="4.598990456s" podCreationTimestamp="2026-03-13 21:09:33 +0000 UTC" firstStartedPulling="2026-03-13 21:09:34.544180948 +0000 UTC m=+2505.565296839" lastFinishedPulling="2026-03-13 21:09:37.120140785 +0000 UTC m=+2508.141256666" observedRunningTime="2026-03-13 21:09:37.586181994 +0000 UTC m=+2508.607297895" watchObservedRunningTime="2026-03-13 21:09:37.598990456 +0000 UTC m=+2508.620106347" Mar 13 21:09:39 crc kubenswrapper[4790]: 
I0313 21:09:39.590695 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerID="e137a0d045ac7d76f4b1c0eb61bbf159c0b0eb2b656ff90d4306bffd4b4bb7f4" exitCode=0 Mar 13 21:09:39 crc kubenswrapper[4790]: I0313 21:09:39.591472 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerDied","Data":"e137a0d045ac7d76f4b1c0eb61bbf159c0b0eb2b656ff90d4306bffd4b4bb7f4"} Mar 13 21:09:39 crc kubenswrapper[4790]: I0313 21:09:39.670503 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:39 crc kubenswrapper[4790]: E0313 21:09:39.670867 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:09:40 crc kubenswrapper[4790]: I0313 21:09:40.995924 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.085815 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.085897 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.085934 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086010 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086062 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086151 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") pod 
\"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086180 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086209 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086261 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086299 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.086340 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") pod \"7b947c94-305d-453d-b2f0-bcf3c84467b3\" (UID: \"7b947c94-305d-453d-b2f0-bcf3c84467b3\") " Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.092041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg" (OuterVolumeSpecName: "kube-api-access-44jvg") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "kube-api-access-44jvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.108752 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.118597 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.121806 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.132077 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.133544 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.137841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.140183 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.141841 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.143124 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.161437 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory" (OuterVolumeSpecName: "inventory") pod "7b947c94-305d-453d-b2f0-bcf3c84467b3" (UID: "7b947c94-305d-453d-b2f0-bcf3c84467b3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189212 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189252 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44jvg\" (UniqueName: \"kubernetes.io/projected/7b947c94-305d-453d-b2f0-bcf3c84467b3-kube-api-access-44jvg\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189261 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189271 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189281 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189290 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189301 4790 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189312 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189322 4790 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189333 4790 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.189345 4790 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/7b947c94-305d-453d-b2f0-bcf3c84467b3-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.611179 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" event={"ID":"7b947c94-305d-453d-b2f0-bcf3c84467b3","Type":"ContainerDied","Data":"5e955eaeb50a09aee8260c883bbe8ca2342e2062895d307915c8671ae2b82195"} Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.611573 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e955eaeb50a09aee8260c883bbe8ca2342e2062895d307915c8671ae2b82195" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.611245 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-lhxzv" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.710525 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr"] Mar 13 21:09:41 crc kubenswrapper[4790]: E0313 21:09:41.711161 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.711189 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.711452 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b947c94-305d-453d-b2f0-bcf3c84467b3" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.712144 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716113 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716269 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716278 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-r5n8m" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716285 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.716368 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.723388 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr"] Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801044 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801433 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801518 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801790 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.801959 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" 
Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.802016 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.802132 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904625 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904676 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904703 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904771 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904828 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904853 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.904911 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908545 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908566 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908622 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.908640 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.909517 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.910182 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:41 crc kubenswrapper[4790]: I0313 21:09:41.923154 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd6pw\" (UniqueName: 
\"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:42 crc kubenswrapper[4790]: I0313 21:09:42.106736 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:09:42 crc kubenswrapper[4790]: I0313 21:09:42.616492 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr"] Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.579352 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.579661 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.629667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerStarted","Data":"fbd28e9d93d15c499d0b1595969d50c3452959738f825ad9e7b6e4609348ae9c"} Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.629736 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerStarted","Data":"1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95"} Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.635429 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.656669 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" podStartSLOduration=2.24259048 podStartE2EDuration="2.656640431s" podCreationTimestamp="2026-03-13 21:09:41 +0000 UTC" firstStartedPulling="2026-03-13 21:09:42.618344224 +0000 UTC m=+2513.639460115" lastFinishedPulling="2026-03-13 21:09:43.032394175 +0000 UTC m=+2514.053510066" observedRunningTime="2026-03-13 21:09:43.647096072 +0000 UTC m=+2514.668212003" watchObservedRunningTime="2026-03-13 21:09:43.656640431 +0000 UTC m=+2514.677756342" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.693680 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:43 crc kubenswrapper[4790]: I0313 21:09:43.896863 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:45 crc kubenswrapper[4790]: I0313 21:09:45.642752 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-692s5" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" containerID="cri-o://3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" gracePeriod=2 Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.113288 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.291919 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") pod \"e70f4ff5-2cd5-4915-978d-dfb989d52730\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.291997 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") pod \"e70f4ff5-2cd5-4915-978d-dfb989d52730\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.292133 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") pod \"e70f4ff5-2cd5-4915-978d-dfb989d52730\" (UID: \"e70f4ff5-2cd5-4915-978d-dfb989d52730\") " Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.292913 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities" (OuterVolumeSpecName: "utilities") pod "e70f4ff5-2cd5-4915-978d-dfb989d52730" (UID: "e70f4ff5-2cd5-4915-978d-dfb989d52730"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.293733 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.303731 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl" (OuterVolumeSpecName: "kube-api-access-9qzdl") pod "e70f4ff5-2cd5-4915-978d-dfb989d52730" (UID: "e70f4ff5-2cd5-4915-978d-dfb989d52730"). InnerVolumeSpecName "kube-api-access-9qzdl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.395954 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qzdl\" (UniqueName: \"kubernetes.io/projected/e70f4ff5-2cd5-4915-978d-dfb989d52730-kube-api-access-9qzdl\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652504 4790 generic.go:334] "Generic (PLEG): container finished" podID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" exitCode=0 Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652556 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f"} Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652588 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-692s5" event={"ID":"e70f4ff5-2cd5-4915-978d-dfb989d52730","Type":"ContainerDied","Data":"f098f8de58f4430eacc872f8239f13bde3881a4b8d296b404f354b27ab3de96c"} Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652584 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-692s5" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.652606 4790 scope.go:117] "RemoveContainer" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.676041 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e70f4ff5-2cd5-4915-978d-dfb989d52730" (UID: "e70f4ff5-2cd5-4915-978d-dfb989d52730"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.677823 4790 scope.go:117] "RemoveContainer" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.699176 4790 scope.go:117] "RemoveContainer" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.700802 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e70f4ff5-2cd5-4915-978d-dfb989d52730-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.742600 4790 scope.go:117] "RemoveContainer" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" Mar 13 21:09:46 crc kubenswrapper[4790]: E0313 21:09:46.743013 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f\": container with ID starting with 3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f not found: ID does not exist" containerID="3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743042 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f"} err="failed to get container status \"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f\": rpc error: code = NotFound desc = could not find container \"3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f\": container with ID starting with 3969ff24b5b9fec946b0460151a7bffaa9df43352730008a167f2bf4672fd47f not found: ID does not exist" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743064 4790 scope.go:117] "RemoveContainer" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" Mar 13 21:09:46 crc kubenswrapper[4790]: E0313 21:09:46.743506 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec\": container with ID starting with dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec not found: ID does not exist" containerID="dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743553 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec"} err="failed to get container status \"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec\": rpc error: code = NotFound desc = could not find container \"dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec\": container with ID starting with dbf0f2ebae2b53a631b13b83ca757955eedb6cbb35341b4b4454b85dcdac51ec not found: ID does not exist" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.743589 4790 scope.go:117] "RemoveContainer" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" Mar 13 21:09:46 crc kubenswrapper[4790]: E0313 21:09:46.743980 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579\": container with ID starting with 54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579 not found: ID does not exist" containerID="54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.744009 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579"} err="failed to get container status \"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579\": rpc error: code = NotFound desc = could not find container \"54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579\": container with ID starting with 54054605b065d4065b14fefe9ba5cccd75ef0813fb8b5985609283339aa3b579 not found: ID does not exist" Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.988986 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:46 crc kubenswrapper[4790]: I0313 21:09:46.997822 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-692s5"] Mar 13 21:09:47 crc kubenswrapper[4790]: I0313 21:09:47.675956 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" path="/var/lib/kubelet/pods/e70f4ff5-2cd5-4915-978d-dfb989d52730/volumes" Mar 13 21:09:53 crc kubenswrapper[4790]: I0313 21:09:53.661302 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:09:53 crc kubenswrapper[4790]: E0313 21:09:53.663175 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.145283 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:10:00 crc kubenswrapper[4790]: E0313 21:10:00.146497 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146516 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" Mar 13 21:10:00 crc kubenswrapper[4790]: E0313 21:10:00.146543 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-utilities" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146552 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-utilities" Mar 13 21:10:00 crc kubenswrapper[4790]: E0313 21:10:00.146572 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-content" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.146582 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="extract-content" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 
21:10:00.146837 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e70f4ff5-2cd5-4915-978d-dfb989d52730" containerName="registry-server" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.147833 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.150121 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"auto-csr-approver-29557270-xndpr\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.153183 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.153370 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.153548 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.158597 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.251725 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"auto-csr-approver-29557270-xndpr\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.271334 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"auto-csr-approver-29557270-xndpr\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:00 crc kubenswrapper[4790]: I0313 21:10:00.470096 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:01 crc kubenswrapper[4790]: I0313 21:10:00.890508 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:10:01 crc kubenswrapper[4790]: W0313 21:10:00.893833 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68f751f6_8e31_448a_99e9_bf7f290684be.slice/crio-d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01 WatchSource:0}: Error finding container d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01: Status 404 returned error can't find the container with id d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01 Mar 13 21:10:01 crc kubenswrapper[4790]: I0313 21:10:01.797422 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557270-xndpr" event={"ID":"68f751f6-8e31-448a-99e9-bf7f290684be","Type":"ContainerStarted","Data":"d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01"} Mar 13 21:10:02 crc kubenswrapper[4790]: I0313 21:10:02.806812 4790 generic.go:334] "Generic (PLEG): container finished" podID="68f751f6-8e31-448a-99e9-bf7f290684be" containerID="231c0b730759ce0ec6fa00fad0e521d17888055794c0179d0b2c116cf68aaf15" exitCode=0 Mar 13 21:10:02 crc kubenswrapper[4790]: I0313 21:10:02.806899 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557270-xndpr" event={"ID":"68f751f6-8e31-448a-99e9-bf7f290684be","Type":"ContainerDied","Data":"231c0b730759ce0ec6fa00fad0e521d17888055794c0179d0b2c116cf68aaf15"} Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.128605 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.326041 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") pod \"68f751f6-8e31-448a-99e9-bf7f290684be\" (UID: \"68f751f6-8e31-448a-99e9-bf7f290684be\") " Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.333045 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr" (OuterVolumeSpecName: "kube-api-access-r7ftr") pod "68f751f6-8e31-448a-99e9-bf7f290684be" (UID: "68f751f6-8e31-448a-99e9-bf7f290684be"). InnerVolumeSpecName "kube-api-access-r7ftr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.428168 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r7ftr\" (UniqueName: \"kubernetes.io/projected/68f751f6-8e31-448a-99e9-bf7f290684be-kube-api-access-r7ftr\") on node \"crc\" DevicePath \"\"" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.825405 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557270-xndpr" event={"ID":"68f751f6-8e31-448a-99e9-bf7f290684be","Type":"ContainerDied","Data":"d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01"} Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.825454 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d15c15610e5327ea8bfd867c70011f229b7e35ab77317203ec82da4de9241a01" Mar 13 21:10:04 crc kubenswrapper[4790]: I0313 21:10:04.825519 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557270-xndpr" Mar 13 21:10:05 crc kubenswrapper[4790]: I0313 21:10:05.185405 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:10:05 crc kubenswrapper[4790]: I0313 21:10:05.193836 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557264-b5j85"] Mar 13 21:10:05 crc kubenswrapper[4790]: I0313 21:10:05.679081 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33811d20-0fb8-4b06-a9dd-d2488b19d7b9" path="/var/lib/kubelet/pods/33811d20-0fb8-4b06-a9dd-d2488b19d7b9/volumes" Mar 13 21:10:06 crc kubenswrapper[4790]: I0313 21:10:06.660047 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:06 crc kubenswrapper[4790]: E0313 21:10:06.660877 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:20 crc kubenswrapper[4790]: I0313 21:10:20.660665 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:20 crc kubenswrapper[4790]: E0313 21:10:20.661616 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:33 crc kubenswrapper[4790]: I0313 21:10:33.660777 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:33 crc kubenswrapper[4790]: E0313 21:10:33.661618 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:45 crc kubenswrapper[4790]: I0313 21:10:45.661340 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:45 crc kubenswrapper[4790]: E0313 21:10:45.662075 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:10:57 crc kubenswrapper[4790]: I0313 21:10:57.294363 4790 scope.go:117] "RemoveContainer" containerID="3a443bd9f4b8d1df7af93baf309b6b85a45139407ed6e8e7a9df32fd174d2a54" Mar 13 21:10:58 crc kubenswrapper[4790]: I0313 21:10:58.660424 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:10:58 crc kubenswrapper[4790]: E0313 21:10:58.661209 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:11:10 crc kubenswrapper[4790]: I0313 21:11:10.660140 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:11:10 crc kubenswrapper[4790]: E0313 21:11:10.661824 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:11:25 crc kubenswrapper[4790]: I0313 21:11:25.660304 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:11:26 crc kubenswrapper[4790]: I0313 21:11:26.561328 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795"} Mar 13 21:11:48 crc kubenswrapper[4790]: I0313 21:11:48.760988 4790 generic.go:334] "Generic (PLEG): container finished" podID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerID="fbd28e9d93d15c499d0b1595969d50c3452959738f825ad9e7b6e4609348ae9c" exitCode=0 Mar 13 21:11:48 crc kubenswrapper[4790]: I0313 21:11:48.761097 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerDied","Data":"fbd28e9d93d15c499d0b1595969d50c3452959738f825ad9e7b6e4609348ae9c"} Mar 13 21:11:50 crc 
kubenswrapper[4790]: I0313 21:11:50.168387 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276597 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276661 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276727 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276752 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276775 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276845 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.276909 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") pod \"71b17a66-faf5-4379-ace9-a4fff12cac5b\" (UID: \"71b17a66-faf5-4379-ace9-a4fff12cac5b\") " Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.293811 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw" (OuterVolumeSpecName: "kube-api-access-wd6pw") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "kube-api-access-wd6pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.293867 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.306021 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.307360 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.312467 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory" (OuterVolumeSpecName: "inventory") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.316305 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.316673 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "71b17a66-faf5-4379-ace9-a4fff12cac5b" (UID: "71b17a66-faf5-4379-ace9-a4fff12cac5b"). InnerVolumeSpecName "ceilometer-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405025 4790 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-inventory\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405074 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405130 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405146 4790 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405159 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wd6pw\" (UniqueName: \"kubernetes.io/projected/71b17a66-faf5-4379-ace9-a4fff12cac5b-kube-api-access-wd6pw\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405178 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.405192 4790 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71b17a66-faf5-4379-ace9-a4fff12cac5b-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.780079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" event={"ID":"71b17a66-faf5-4379-ace9-a4fff12cac5b","Type":"ContainerDied","Data":"1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95"} Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.780123 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95" Mar 13 21:11:50 crc kubenswrapper[4790]: I0313 21:11:50.780190 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr" Mar 13 21:11:50 crc kubenswrapper[4790]: E0313 21:11:50.934513 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71b17a66_faf5_4379_ace9_a4fff12cac5b.slice/crio-1ebb38435b2d88705a4cd2dfc97f54f6a8a51f0502550183f97cc9f577d6ca95\": RecentStats: unable to find data in memory cache]" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.148307 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:12:00 crc kubenswrapper[4790]: E0313 21:12:00.149186 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149202 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 21:12:00 crc kubenswrapper[4790]: E0313 21:12:00.149243 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" containerName="oc" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149249 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" containerName="oc" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149434 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" containerName="oc" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.149449 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="71b17a66-faf5-4379-ace9-a4fff12cac5b" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.150048 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.152321 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.152729 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.152791 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.156802 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.279450 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"auto-csr-approver-29557272-cdqgf\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.380846 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"auto-csr-approver-29557272-cdqgf\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.403019 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"auto-csr-approver-29557272-cdqgf\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.467305 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.897127 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.902665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:12:00 crc kubenswrapper[4790]: I0313 21:12:00.960523 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" event={"ID":"7b51130e-f39a-4a4c-b41e-c865e51004dd","Type":"ContainerStarted","Data":"1fa852d5331967e3d23cf1b3419e4624321eec271574a6ac797f9a29e8389d08"} Mar 13 21:12:02 crc kubenswrapper[4790]: I0313 21:12:02.978982 4790 generic.go:334] "Generic (PLEG): container finished" podID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerID="4ea25a336829635b84ca0d8e478c73129cc595166d50214c193658a79404456f" exitCode=0 Mar 13 21:12:02 crc kubenswrapper[4790]: I0313 21:12:02.979030 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" event={"ID":"7b51130e-f39a-4a4c-b41e-c865e51004dd","Type":"ContainerDied","Data":"4ea25a336829635b84ca0d8e478c73129cc595166d50214c193658a79404456f"} Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.305969 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.452499 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") pod \"7b51130e-f39a-4a4c-b41e-c865e51004dd\" (UID: \"7b51130e-f39a-4a4c-b41e-c865e51004dd\") " Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.458021 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq" (OuterVolumeSpecName: "kube-api-access-qdcpq") pod "7b51130e-f39a-4a4c-b41e-c865e51004dd" (UID: "7b51130e-f39a-4a4c-b41e-c865e51004dd"). InnerVolumeSpecName "kube-api-access-qdcpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.555328 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qdcpq\" (UniqueName: \"kubernetes.io/projected/7b51130e-f39a-4a4c-b41e-c865e51004dd-kube-api-access-qdcpq\") on node \"crc\" DevicePath \"\"" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.996196 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" event={"ID":"7b51130e-f39a-4a4c-b41e-c865e51004dd","Type":"ContainerDied","Data":"1fa852d5331967e3d23cf1b3419e4624321eec271574a6ac797f9a29e8389d08"} Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.996237 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557272-cdqgf" Mar 13 21:12:04 crc kubenswrapper[4790]: I0313 21:12:04.996239 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fa852d5331967e3d23cf1b3419e4624321eec271574a6ac797f9a29e8389d08" Mar 13 21:12:05 crc kubenswrapper[4790]: I0313 21:12:05.370893 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:12:05 crc kubenswrapper[4790]: I0313 21:12:05.380002 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557266-4sbr6"] Mar 13 21:12:05 crc kubenswrapper[4790]: I0313 21:12:05.672047 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a921d70-847d-4a96-ad9a-18438299237e" path="/var/lib/kubelet/pods/6a921d70-847d-4a96-ad9a-18438299237e/volumes" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.878031 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 21:12:42 crc kubenswrapper[4790]: E0313 21:12:42.879774 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerName="oc" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.879800 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerName="oc" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.880070 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" containerName="oc" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.880910 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.883179 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.883197 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.883951 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.884017 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-xndhm" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.895547 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992181 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992275 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992301 4790 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992324 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992577 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992710 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992731 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:42 crc kubenswrapper[4790]: I0313 21:12:42.992819 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.094976 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095032 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095092 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095133 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095197 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095218 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095241 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095288 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095340 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095573 4790 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.095992 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.096333 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" 
(UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.096448 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.096687 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.102840 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.103010 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.104508 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.116534 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.126845 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"tempest-tests-tempest\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.248568 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:12:43 crc kubenswrapper[4790]: I0313 21:12:43.681855 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 13 21:12:44 crc kubenswrapper[4790]: I0313 21:12:44.360188 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerStarted","Data":"9ae048789cc06b95d8d9a690f59586791ebaca094ac82840d0dd227be9680876"} Mar 13 21:12:57 crc kubenswrapper[4790]: I0313 21:12:57.391363 4790 scope.go:117] "RemoveContainer" containerID="3e0c0f63bb37da5c2b233a3b4a5d7ae121b4ed58aa4773dd2ed0d98e00fff307" Mar 13 21:13:18 crc kubenswrapper[4790]: E0313 21:13:18.862446 4790 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 13 21:13:18 crc kubenswrapper[4790]: E0313 21:13:18.863028 4790 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pbzlr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},S
tdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(50c1f858-4451-4e6e-9e80-6e37528305a2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 21:13:18 crc kubenswrapper[4790]: E0313 21:13:18.864180 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" Mar 13 21:13:19 crc kubenswrapper[4790]: E0313 21:13:19.718018 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" Mar 13 21:13:31 crc kubenswrapper[4790]: I0313 21:13:31.049441 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 13 21:13:32 crc kubenswrapper[4790]: I0313 21:13:32.830175 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerStarted","Data":"5f3dc8212dd652060ecb9c9d45ce324d2168353ccf633608e4415a58fb8949f8"} Mar 13 21:13:32 crc kubenswrapper[4790]: I0313 21:13:32.853223 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.491328204 podStartE2EDuration="51.853206943s" podCreationTimestamp="2026-03-13 21:12:41 +0000 UTC" firstStartedPulling="2026-03-13 21:12:43.684009242 +0000 UTC m=+2694.705125153" lastFinishedPulling="2026-03-13 21:13:31.045888001 +0000 UTC m=+2742.067003892" observedRunningTime="2026-03-13 21:13:32.850231069 +0000 UTC m=+2743.871346960" watchObservedRunningTime="2026-03-13 21:13:32.853206943 +0000 UTC m=+2743.874322824" Mar 13 21:13:44 crc kubenswrapper[4790]: I0313 21:13:44.015705 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:13:44 crc kubenswrapper[4790]: I0313 21:13:44.016202 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.146822 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 
21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.149834 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.152539 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.152709 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.153191 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.157155 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.246943 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"auto-csr-approver-29557274-6gbsg\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.349580 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"auto-csr-approver-29557274-6gbsg\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.370539 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"auto-csr-approver-29557274-6gbsg\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.479318 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:00 crc kubenswrapper[4790]: I0313 21:14:00.911763 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:14:01 crc kubenswrapper[4790]: I0313 21:14:01.274011 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" event={"ID":"657afd7a-d901-4df2-96d4-239bf59388bd","Type":"ContainerStarted","Data":"940dcc75d9bf37c5e0b1e03ecf8787b3eb3a2b6a8a9f7ed159c34b124bbd9616"} Mar 13 21:14:03 crc kubenswrapper[4790]: I0313 21:14:03.291780 4790 generic.go:334] "Generic (PLEG): container finished" podID="657afd7a-d901-4df2-96d4-239bf59388bd" containerID="d277a0373c5a7461ab377865cd1179cec1bb76b46da5d05b6de42a92acf13b80" exitCode=0 Mar 13 21:14:03 crc kubenswrapper[4790]: I0313 21:14:03.292043 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" event={"ID":"657afd7a-d901-4df2-96d4-239bf59388bd","Type":"ContainerDied","Data":"d277a0373c5a7461ab377865cd1179cec1bb76b46da5d05b6de42a92acf13b80"} Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.428104 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.434171 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.445291 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.529420 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.529716 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.529770 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.630949 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.631121 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.631820 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.631222 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.632055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.654302 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"redhat-operators-v9cjw\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.731350 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.733094 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") pod \"657afd7a-d901-4df2-96d4-239bf59388bd\" (UID: \"657afd7a-d901-4df2-96d4-239bf59388bd\") " Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.736516 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr" (OuterVolumeSpecName: "kube-api-access-s4pjr") pod "657afd7a-d901-4df2-96d4-239bf59388bd" (UID: "657afd7a-d901-4df2-96d4-239bf59388bd"). InnerVolumeSpecName "kube-api-access-s4pjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.757279 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:04 crc kubenswrapper[4790]: I0313 21:14:04.835765 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4pjr\" (UniqueName: \"kubernetes.io/projected/657afd7a-d901-4df2-96d4-239bf59388bd-kube-api-access-s4pjr\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:05 crc kubenswrapper[4790]: W0313 21:14:05.226745 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf31292a3_5896_4bab_bd6e_bc45dffabc58.slice/crio-37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272 WatchSource:0}: Error finding container 37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272: Status 404 returned error can't find the container with id 37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272 Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.229973 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.323801 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerStarted","Data":"37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272"} Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.326763 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" event={"ID":"657afd7a-d901-4df2-96d4-239bf59388bd","Type":"ContainerDied","Data":"940dcc75d9bf37c5e0b1e03ecf8787b3eb3a2b6a8a9f7ed159c34b124bbd9616"} Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.326824 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940dcc75d9bf37c5e0b1e03ecf8787b3eb3a2b6a8a9f7ed159c34b124bbd9616" Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.326883 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557274-6gbsg" Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.845180 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:14:05 crc kubenswrapper[4790]: I0313 21:14:05.864695 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557268-fbrfb"] Mar 13 21:14:06 crc kubenswrapper[4790]: I0313 21:14:06.335308 4790 generic.go:334] "Generic (PLEG): container finished" podID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" exitCode=0 Mar 13 21:14:06 crc kubenswrapper[4790]: I0313 21:14:06.335406 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e"} Mar 13 21:14:07 crc kubenswrapper[4790]: I0313 21:14:07.673228 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f962ddd-b18b-43c7-81e1-7eda48d64d88" path="/var/lib/kubelet/pods/5f962ddd-b18b-43c7-81e1-7eda48d64d88/volumes" Mar 13 21:14:08 crc kubenswrapper[4790]: I0313 21:14:08.357589 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerStarted","Data":"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac"} Mar 13 21:14:10 crc kubenswrapper[4790]: I0313 21:14:10.376478 4790 generic.go:334] "Generic (PLEG): container finished" podID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" exitCode=0 Mar 13 21:14:10 crc kubenswrapper[4790]: I0313 21:14:10.376567 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac"} Mar 13 21:14:11 crc kubenswrapper[4790]: I0313 21:14:11.398988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerStarted","Data":"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a"} Mar 13 21:14:11 crc kubenswrapper[4790]: I0313 21:14:11.430825 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-v9cjw" podStartSLOduration=2.741520109 podStartE2EDuration="7.430807037s" podCreationTimestamp="2026-03-13 21:14:04 +0000 UTC" firstStartedPulling="2026-03-13 21:14:06.337796709 +0000 UTC m=+2777.358912590" lastFinishedPulling="2026-03-13 21:14:11.027083627 +0000 UTC m=+2782.048199518" observedRunningTime="2026-03-13 21:14:11.423573723 +0000 UTC m=+2782.444689614" watchObservedRunningTime="2026-03-13 21:14:11.430807037 +0000 UTC m=+2782.451922928" Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.016156 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.016661 4790 prober.go:107] "Probe failed" 
probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.758110 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:14 crc kubenswrapper[4790]: I0313 21:14:14.758534 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:15 crc kubenswrapper[4790]: I0313 21:14:15.802633 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" probeResult="failure" output=< Mar 13 21:14:15 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:14:15 crc kubenswrapper[4790]: > Mar 13 21:14:18 crc kubenswrapper[4790]: I0313 21:14:18.831658 4790 scope.go:117] "RemoveContainer" containerID="bdca2f8da697e12973555a54d7d0753abfb943fd0d2919dd4adb4178a3e9c052" Mar 13 21:14:25 crc kubenswrapper[4790]: I0313 21:14:25.806437 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" probeResult="failure" output=< Mar 13 21:14:25 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:14:25 crc kubenswrapper[4790]: > Mar 13 21:14:35 crc kubenswrapper[4790]: I0313 21:14:35.800953 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" probeResult="failure" output=< Mar 13 21:14:35 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:14:35 crc kubenswrapper[4790]: > Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.016257 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.016877 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.016938 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.017747 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.017821 4790 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795" gracePeriod=600 Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.684920 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795" exitCode=0 Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.684990 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795"} Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.685316 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36"} Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.685339 4790 scope.go:117] "RemoveContainer" containerID="23f0b9fb18b38de1beaed2d1c28a89e3450b5183e402dddf6d00d598b61c0bd5" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.808643 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:44 crc kubenswrapper[4790]: I0313 21:14:44.864700 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:45 crc kubenswrapper[4790]: I0313 21:14:45.047226 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:46 crc kubenswrapper[4790]: I0313 21:14:46.702165 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-v9cjw" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" containerID="cri-o://8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" gracePeriod=2 Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.201171 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.369883 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") pod \"f31292a3-5896-4bab-bd6e-bc45dffabc58\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.370018 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") pod \"f31292a3-5896-4bab-bd6e-bc45dffabc58\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.370133 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") pod \"f31292a3-5896-4bab-bd6e-bc45dffabc58\" (UID: \"f31292a3-5896-4bab-bd6e-bc45dffabc58\") " Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.370807 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities" (OuterVolumeSpecName: "utilities") pod "f31292a3-5896-4bab-bd6e-bc45dffabc58" (UID: "f31292a3-5896-4bab-bd6e-bc45dffabc58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.377460 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29" (OuterVolumeSpecName: "kube-api-access-dhc29") pod "f31292a3-5896-4bab-bd6e-bc45dffabc58" (UID: "f31292a3-5896-4bab-bd6e-bc45dffabc58"). InnerVolumeSpecName "kube-api-access-dhc29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.472190 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.472230 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhc29\" (UniqueName: \"kubernetes.io/projected/f31292a3-5896-4bab-bd6e-bc45dffabc58-kube-api-access-dhc29\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.506336 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f31292a3-5896-4bab-bd6e-bc45dffabc58" (UID: "f31292a3-5896-4bab-bd6e-bc45dffabc58"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.573511 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f31292a3-5896-4bab-bd6e-bc45dffabc58-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735098 4790 generic.go:334] "Generic (PLEG): container finished" podID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" exitCode=0 Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735509 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a"} Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735546 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-v9cjw" event={"ID":"f31292a3-5896-4bab-bd6e-bc45dffabc58","Type":"ContainerDied","Data":"37d2392ec75de9669244c14c79a003ddd20c18d7c108d9be1229bc52a91bd272"} Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735567 4790 scope.go:117] "RemoveContainer" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.735838 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-v9cjw" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.769978 4790 scope.go:117] "RemoveContainer" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.771234 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.799268 4790 scope.go:117] "RemoveContainer" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.802394 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-v9cjw"] Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.849884 4790 scope.go:117] "RemoveContainer" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" Mar 13 21:14:47 crc kubenswrapper[4790]: E0313 21:14:47.851030 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a\": container with ID starting with 8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a not found: ID does not exist" containerID="8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.851101 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a"} err="failed to get container status \"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a\": rpc error: code = NotFound desc = could not find container \"8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a\": container with ID starting with 8be056ca1062e1ac2dc981c84a2e76e511705be0c9f7cb67dbd1f7a4c0d2a01a not found: ID does not exist" Mar 13 21:14:47 crc 
kubenswrapper[4790]: I0313 21:14:47.851156 4790 scope.go:117] "RemoveContainer" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" Mar 13 21:14:47 crc kubenswrapper[4790]: E0313 21:14:47.852036 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac\": container with ID starting with d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac not found: ID does not exist" containerID="d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.852063 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac"} err="failed to get container status \"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac\": rpc error: code = NotFound desc = could not find container \"d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac\": container with ID starting with d97be052caddddb2e788e73c83d53db5c1e8f2188f25ebd115b6cc2b51cea8ac not found: ID does not exist" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.852079 4790 scope.go:117] "RemoveContainer" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" Mar 13 21:14:47 crc kubenswrapper[4790]: E0313 21:14:47.852829 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e\": container with ID starting with fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e not found: ID does not exist" containerID="fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e" Mar 13 21:14:47 crc kubenswrapper[4790]: I0313 21:14:47.852875 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e"} err="failed to get container status \"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e\": rpc error: code = NotFound desc = could not find container \"fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e\": container with ID starting with fbc979b0d27b069657241118bbbe542449b826b47099ff8b3720cdaabc9eea9e not found: ID does not exist" Mar 13 21:14:49 crc kubenswrapper[4790]: I0313 21:14:49.678124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" path="/var/lib/kubelet/pods/f31292a3-5896-4bab-bd6e-bc45dffabc58/volumes" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.138939 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j"] Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.139879 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" containerName="oc" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.139898 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" containerName="oc" Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.139937 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-utilities" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.139946 4790 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-utilities" Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.139983 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-content" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.139991 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="extract-content" Mar 13 21:15:00 crc kubenswrapper[4790]: E0313 21:15:00.140001 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.140009 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.140218 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f31292a3-5896-4bab-bd6e-bc45dffabc58" containerName="registry-server" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.140255 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" containerName="oc" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.141100 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.143129 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.144025 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.163825 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j"] Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.305089 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.305258 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.305456 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.407589 4790 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.407668 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.407740 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.408597 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.413149 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.423015 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"collect-profiles-29557275-nxf7j\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.465031 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:00 crc kubenswrapper[4790]: I0313 21:15:00.904021 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j"] Mar 13 21:15:01 crc kubenswrapper[4790]: I0313 21:15:01.866241 4790 generic.go:334] "Generic (PLEG): container finished" podID="d955a3c8-0b10-4040-8fc8-043862800b24" containerID="51a0b2215403cc456953ad266cc3365cf4785481bce1dd6c596170361ac34e20" exitCode=0 Mar 13 21:15:01 crc kubenswrapper[4790]: I0313 21:15:01.866311 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" event={"ID":"d955a3c8-0b10-4040-8fc8-043862800b24","Type":"ContainerDied","Data":"51a0b2215403cc456953ad266cc3365cf4785481bce1dd6c596170361ac34e20"} Mar 13 21:15:01 crc kubenswrapper[4790]: I0313 21:15:01.866582 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" event={"ID":"d955a3c8-0b10-4040-8fc8-043862800b24","Type":"ContainerStarted","Data":"356d142c1c345b4cc1b5d0c6e53ffc8fb48ebe85a954a45f5c6fda7d8f27ad0d"} Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.348857 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.464266 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") pod \"d955a3c8-0b10-4040-8fc8-043862800b24\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.464623 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") pod \"d955a3c8-0b10-4040-8fc8-043862800b24\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.464759 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") pod \"d955a3c8-0b10-4040-8fc8-043862800b24\" (UID: \"d955a3c8-0b10-4040-8fc8-043862800b24\") " Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.465125 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume" (OuterVolumeSpecName: "config-volume") pod "d955a3c8-0b10-4040-8fc8-043862800b24" (UID: "d955a3c8-0b10-4040-8fc8-043862800b24"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.465384 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d955a3c8-0b10-4040-8fc8-043862800b24-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.469629 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx" (OuterVolumeSpecName: "kube-api-access-2jdpx") pod "d955a3c8-0b10-4040-8fc8-043862800b24" (UID: "d955a3c8-0b10-4040-8fc8-043862800b24"). InnerVolumeSpecName "kube-api-access-2jdpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.470569 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d955a3c8-0b10-4040-8fc8-043862800b24" (UID: "d955a3c8-0b10-4040-8fc8-043862800b24"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.567142 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d955a3c8-0b10-4040-8fc8-043862800b24-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.567183 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdpx\" (UniqueName: \"kubernetes.io/projected/d955a3c8-0b10-4040-8fc8-043862800b24-kube-api-access-2jdpx\") on node \"crc\" DevicePath \"\"" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.885183 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" event={"ID":"d955a3c8-0b10-4040-8fc8-043862800b24","Type":"ContainerDied","Data":"356d142c1c345b4cc1b5d0c6e53ffc8fb48ebe85a954a45f5c6fda7d8f27ad0d"} Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.885231 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="356d142c1c345b4cc1b5d0c6e53ffc8fb48ebe85a954a45f5c6fda7d8f27ad0d" Mar 13 21:15:03 crc kubenswrapper[4790]: I0313 21:15:03.885241 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557275-nxf7j" Mar 13 21:15:04 crc kubenswrapper[4790]: I0313 21:15:04.425149 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 21:15:04 crc kubenswrapper[4790]: I0313 21:15:04.434190 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557230-rjmvn"] Mar 13 21:15:05 crc kubenswrapper[4790]: I0313 21:15:05.671453 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87e4f09f-d19e-4b0a-85b2-636b5ce5ef51" path="/var/lib/kubelet/pods/87e4f09f-d19e-4b0a-85b2-636b5ce5ef51/volumes" Mar 13 21:15:18 crc kubenswrapper[4790]: I0313 21:15:18.916371 4790 scope.go:117] "RemoveContainer" containerID="93c1f10337c2883de8c80150a75f7613328eeffafc6c4c7570ee71639cf9048a" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.147163 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:16:00 crc kubenswrapper[4790]: E0313 21:16:00.148201 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d955a3c8-0b10-4040-8fc8-043862800b24" containerName="collect-profiles" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.148217 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="d955a3c8-0b10-4040-8fc8-043862800b24" containerName="collect-profiles" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.148460 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="d955a3c8-0b10-4040-8fc8-043862800b24" containerName="collect-profiles" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.149218 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.152082 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.152613 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.153642 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.158191 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.183955 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"auto-csr-approver-29557276-9dtrx\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.287422 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"auto-csr-approver-29557276-9dtrx\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.305808 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"auto-csr-approver-29557276-9dtrx\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.468038 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:00 crc kubenswrapper[4790]: I0313 21:16:00.938662 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:16:01 crc kubenswrapper[4790]: I0313 21:16:01.378605 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerStarted","Data":"8e891209a6d777a268d6770601327cf5c9c35bc17f362059f286b01cbcf4ab2b"} Mar 13 21:16:02 crc kubenswrapper[4790]: I0313 21:16:02.393880 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerStarted","Data":"1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a"} Mar 13 21:16:02 crc kubenswrapper[4790]: I0313 21:16:02.413791 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" podStartSLOduration=1.277003166 podStartE2EDuration="2.413772539s" podCreationTimestamp="2026-03-13 21:16:00 +0000 UTC" firstStartedPulling="2026-03-13 21:16:00.946804299 +0000 UTC m=+2891.967920190" lastFinishedPulling="2026-03-13 21:16:02.083573672 +0000 UTC m=+2893.104689563" observedRunningTime="2026-03-13 21:16:02.412143634 +0000 UTC m=+2893.433259535" watchObservedRunningTime="2026-03-13 21:16:02.413772539 +0000 UTC m=+2893.434888430" Mar 13 21:16:03 crc kubenswrapper[4790]: I0313 21:16:03.404031 4790 generic.go:334] "Generic (PLEG): container finished" podID="1f2961af-f195-403c-bfa2-fd01638789d4" containerID="1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a" exitCode=0 Mar 13 21:16:03 crc kubenswrapper[4790]: I0313 21:16:03.404079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerDied","Data":"1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a"} Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.845746 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.874339 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") pod \"1f2961af-f195-403c-bfa2-fd01638789d4\" (UID: \"1f2961af-f195-403c-bfa2-fd01638789d4\") " Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.881090 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5" (OuterVolumeSpecName: "kube-api-access-zvcf5") pod "1f2961af-f195-403c-bfa2-fd01638789d4" (UID: "1f2961af-f195-403c-bfa2-fd01638789d4"). InnerVolumeSpecName "kube-api-access-zvcf5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:16:04 crc kubenswrapper[4790]: I0313 21:16:04.976228 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvcf5\" (UniqueName: \"kubernetes.io/projected/1f2961af-f195-403c-bfa2-fd01638789d4-kube-api-access-zvcf5\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.423894 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" event={"ID":"1f2961af-f195-403c-bfa2-fd01638789d4","Type":"ContainerDied","Data":"8e891209a6d777a268d6770601327cf5c9c35bc17f362059f286b01cbcf4ab2b"} Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.423930 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e891209a6d777a268d6770601327cf5c9c35bc17f362059f286b01cbcf4ab2b" Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.423967 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557276-9dtrx" Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.485425 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.492843 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557270-xndpr"] Mar 13 21:16:05 crc kubenswrapper[4790]: I0313 21:16:05.669655 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68f751f6-8e31-448a-99e9-bf7f290684be" path="/var/lib/kubelet/pods/68f751f6-8e31-448a-99e9-bf7f290684be/volumes" Mar 13 21:16:18 crc kubenswrapper[4790]: I0313 21:16:18.996511 4790 scope.go:117] "RemoveContainer" containerID="231c0b730759ce0ec6fa00fad0e521d17888055794c0179d0b2c116cf68aaf15" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.137211 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:30 crc kubenswrapper[4790]: E0313 21:16:30.138272 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" containerName="oc" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.138290 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" containerName="oc" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.138554 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" containerName="oc" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.140162 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.154123 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.192942 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.193094 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.193134 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.294487 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.294827 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.294921 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.295031 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.295089 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.314271 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"redhat-marketplace-mngmq\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.477262 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:30 crc kubenswrapper[4790]: I0313 21:16:30.982509 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:31 crc kubenswrapper[4790]: I0313 21:16:31.670463 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" exitCode=0 Mar 13 21:16:31 crc kubenswrapper[4790]: I0313 21:16:31.694659 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15"} Mar 13 21:16:31 crc kubenswrapper[4790]: I0313 21:16:31.694697 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerStarted","Data":"a3eae7ae0ac98bea414db557f54a72878ac40df96b07b869c33dd07262bd97bc"} Mar 13 21:16:32 crc kubenswrapper[4790]: I0313 21:16:32.681090 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" exitCode=0 Mar 13 21:16:32 crc kubenswrapper[4790]: I0313 21:16:32.681205 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f"} Mar 13 21:16:33 crc kubenswrapper[4790]: I0313 21:16:33.696908 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerStarted","Data":"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec"} Mar 13 21:16:33 crc kubenswrapper[4790]: I0313 21:16:33.721335 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mngmq" podStartSLOduration=2.319261492 podStartE2EDuration="3.721312353s" podCreationTimestamp="2026-03-13 21:16:30 +0000 UTC" firstStartedPulling="2026-03-13 21:16:31.679864235 +0000 UTC m=+2922.700980126" lastFinishedPulling="2026-03-13 21:16:33.081915096 +0000 UTC m=+2924.103030987" observedRunningTime="2026-03-13 21:16:33.71357336 +0000 UTC m=+2924.734689251" watchObservedRunningTime="2026-03-13 21:16:33.721312353 +0000 UTC m=+2924.742428244" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.477629 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.478258 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.527145 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.826419 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:40 crc kubenswrapper[4790]: I0313 21:16:40.876200 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:42 crc kubenswrapper[4790]: I0313 21:16:42.798115 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mngmq" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" containerID="cri-o://6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" gracePeriod=2 Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.267881 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.354358 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") pod \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.354535 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") pod \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.354663 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") pod \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\" (UID: \"bd4a25b0-b765-448b-aebc-895c1e6a13ce\") " Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.358173 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities" (OuterVolumeSpecName: "utilities") pod "bd4a25b0-b765-448b-aebc-895c1e6a13ce" (UID: "bd4a25b0-b765-448b-aebc-895c1e6a13ce"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.366561 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw" (OuterVolumeSpecName: "kube-api-access-wh4pw") pod "bd4a25b0-b765-448b-aebc-895c1e6a13ce" (UID: "bd4a25b0-b765-448b-aebc-895c1e6a13ce"). InnerVolumeSpecName "kube-api-access-wh4pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.457441 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.457490 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wh4pw\" (UniqueName: \"kubernetes.io/projected/bd4a25b0-b765-448b-aebc-895c1e6a13ce-kube-api-access-wh4pw\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807654 4790 generic.go:334] "Generic (PLEG): container finished" podID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" exitCode=0 Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807809 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec"} Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807952 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mngmq" event={"ID":"bd4a25b0-b765-448b-aebc-895c1e6a13ce","Type":"ContainerDied","Data":"a3eae7ae0ac98bea414db557f54a72878ac40df96b07b869c33dd07262bd97bc"} Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807976 4790 scope.go:117] "RemoveContainer" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.807898 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mngmq" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.832953 4790 scope.go:117] "RemoveContainer" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.851951 4790 scope.go:117] "RemoveContainer" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.874805 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bd4a25b0-b765-448b-aebc-895c1e6a13ce" (UID: "bd4a25b0-b765-448b-aebc-895c1e6a13ce"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.897736 4790 scope.go:117] "RemoveContainer" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" Mar 13 21:16:43 crc kubenswrapper[4790]: E0313 21:16:43.898110 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec\": container with ID starting with 6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec not found: ID does not exist" containerID="6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898143 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec"} err="failed to get container status \"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec\": rpc error: code = NotFound desc = could not find container \"6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec\": container with ID starting with 6d5d0f7cecfa49308366ba8a7642971b9fc6294f824879c29dcc9c08414d08ec not found: ID does not exist" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898161 4790 scope.go:117] "RemoveContainer" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" Mar 13 21:16:43 crc kubenswrapper[4790]: E0313 21:16:43.898622 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f\": container with ID starting with 5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f not found: ID does not exist" containerID="5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898672 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f"} err="failed to get container status \"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f\": rpc error: code = NotFound desc = could not find container \"5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f\": container with ID starting with 5902ecfb9dff503c053ca33a53874026ebf550887f272fa99bb7c65eddd1071f not found: ID does not exist" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.898708 4790 scope.go:117] "RemoveContainer" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" Mar 13 21:16:43 crc kubenswrapper[4790]: E0313 21:16:43.899005 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15\": container with ID starting with 09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15 not found: ID does not exist" containerID="09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.899031 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15"} err="failed to get container status \"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15\": rpc error: code = NotFound desc = could not 
find container \"09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15\": container with ID starting with 09936d0bd181cb6ade2618ada68af7a27e78e9bbf621ffa032928d059e4afe15 not found: ID does not exist" Mar 13 21:16:43 crc kubenswrapper[4790]: I0313 21:16:43.967444 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bd4a25b0-b765-448b-aebc-895c1e6a13ce-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.015531 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.015585 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.139135 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:44 crc kubenswrapper[4790]: I0313 21:16:44.149800 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mngmq"] Mar 13 21:16:45 crc kubenswrapper[4790]: I0313 21:16:45.671271 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" path="/var/lib/kubelet/pods/bd4a25b0-b765-448b-aebc-895c1e6a13ce/volumes" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.556074 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:05 crc kubenswrapper[4790]: E0313 21:17:05.557111 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-utilities" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557134 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-utilities" Mar 13 21:17:05 crc kubenswrapper[4790]: E0313 21:17:05.557156 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-content" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557164 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="extract-content" Mar 13 21:17:05 crc kubenswrapper[4790]: E0313 21:17:05.557197 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557204 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.557453 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd4a25b0-b765-448b-aebc-895c1e6a13ce" containerName="registry-server" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.559117 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.567004 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.661399 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.661487 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.661624 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.763543 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.763643 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.763693 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.764155 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.764921 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.789310 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"certified-operators-rvlm5\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:05 crc kubenswrapper[4790]: I0313 21:17:05.880632 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:06 crc kubenswrapper[4790]: I0313 21:17:06.390443 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.000778 4790 generic.go:334] "Generic (PLEG): container finished" podID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" exitCode=0 Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.000815 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b"} Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.000846 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerStarted","Data":"6b0fb3482331fe9ebb6936e0aa0a4fc44603cc49316d776e27cc517752a36fee"} Mar 13 21:17:07 crc kubenswrapper[4790]: I0313 21:17:07.002537 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:17:08 crc kubenswrapper[4790]: I0313 21:17:08.010736 4790 generic.go:334] "Generic (PLEG): container finished" podID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" exitCode=0 Mar 13 21:17:08 crc kubenswrapper[4790]: I0313 21:17:08.010852 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e"} Mar 13 21:17:09 crc kubenswrapper[4790]: I0313 21:17:09.022111 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerStarted","Data":"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42"} Mar 13 21:17:09 crc kubenswrapper[4790]: I0313 21:17:09.040808 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rvlm5" podStartSLOduration=2.609373885 podStartE2EDuration="4.040787341s" podCreationTimestamp="2026-03-13 21:17:05 +0000 UTC" firstStartedPulling="2026-03-13 21:17:07.002298673 +0000 UTC m=+2958.023414554" lastFinishedPulling="2026-03-13 21:17:08.433712119 +0000 UTC m=+2959.454828010" observedRunningTime="2026-03-13 21:17:09.03929218 +0000 UTC m=+2960.060408071" watchObservedRunningTime="2026-03-13 21:17:09.040787341 +0000 UTC m=+2960.061903242" Mar 13 21:17:14 crc kubenswrapper[4790]: I0313 21:17:14.016047 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:17:14 crc kubenswrapper[4790]: I0313 21:17:14.016517 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:17:15 crc kubenswrapper[4790]: I0313 21:17:15.881817 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:15 crc kubenswrapper[4790]: I0313 21:17:15.882187 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:15 crc kubenswrapper[4790]: I0313 21:17:15.928859 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:16 crc kubenswrapper[4790]: I0313 21:17:16.121240 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:16 crc kubenswrapper[4790]: I0313 21:17:16.163276 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.093190 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rvlm5" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" containerID="cri-o://11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" gracePeriod=2 Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.600454 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.699408 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") pod \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.699771 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") pod \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.699958 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") pod \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\" (UID: \"848a94d0-7273-4bd8-a9a3-37a0c83d021d\") " Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.700973 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities" (OuterVolumeSpecName: "utilities") pod "848a94d0-7273-4bd8-a9a3-37a0c83d021d" (UID: "848a94d0-7273-4bd8-a9a3-37a0c83d021d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.710902 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6" (OuterVolumeSpecName: "kube-api-access-zkpd6") pod "848a94d0-7273-4bd8-a9a3-37a0c83d021d" (UID: "848a94d0-7273-4bd8-a9a3-37a0c83d021d"). InnerVolumeSpecName "kube-api-access-zkpd6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.749540 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "848a94d0-7273-4bd8-a9a3-37a0c83d021d" (UID: "848a94d0-7273-4bd8-a9a3-37a0c83d021d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.802548 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.804283 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/848a94d0-7273-4bd8-a9a3-37a0c83d021d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:17:18 crc kubenswrapper[4790]: I0313 21:17:18.804458 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkpd6\" (UniqueName: \"kubernetes.io/projected/848a94d0-7273-4bd8-a9a3-37a0c83d021d-kube-api-access-zkpd6\") on node \"crc\" DevicePath \"\"" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.102775 4790 generic.go:334] "Generic (PLEG): container finished" podID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" exitCode=0 Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.102856 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rvlm5" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.102868 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42"} Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.104025 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rvlm5" event={"ID":"848a94d0-7273-4bd8-a9a3-37a0c83d021d","Type":"ContainerDied","Data":"6b0fb3482331fe9ebb6936e0aa0a4fc44603cc49316d776e27cc517752a36fee"} Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.104044 4790 scope.go:117] "RemoveContainer" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.123069 4790 scope.go:117] "RemoveContainer" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.145245 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.155682 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rvlm5"] Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.169370 4790 scope.go:117] "RemoveContainer" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.196258 4790 scope.go:117] "RemoveContainer" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" Mar 13 21:17:19 crc kubenswrapper[4790]: E0313 21:17:19.196720 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42\": container with ID starting with 11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42 not found: ID does not exist" containerID="11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.196832 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42"} err="failed to get container status \"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42\": rpc error: code = NotFound desc = could not find container \"11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42\": container with ID starting with 11ebaf6e25486618bceb645369b83dc4e0e9143de6846678e23b5044c2fe8d42 not found: ID does not exist" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.196913 4790 scope.go:117] "RemoveContainer" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" Mar 13 21:17:19 crc kubenswrapper[4790]: E0313 21:17:19.197497 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e\": container with ID starting with 9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e not found: ID does not exist" containerID="9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.197608 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e"} err="failed to get container status \"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e\": rpc error: code = NotFound desc = could not find container \"9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e\": container with ID starting with 9fd47b72534179044f5f7a1a338a235cdc625a54ec35157007a57d9ea25ee65e not found: ID does not exist" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.197671 4790 scope.go:117] "RemoveContainer" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" Mar 13 21:17:19 crc kubenswrapper[4790]: E0313 21:17:19.197971 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b\": container with ID starting with 6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b not found: ID does not exist" containerID="6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.198005 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b"} err="failed to get container status \"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b\": rpc error: code = NotFound desc = could not find container \"6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b\": container with ID starting with 6940446cbfb31e7ecbec55ccf29dd9b2ccbde4bc5cd9721825330a7879a2859b not found: ID does not exist" Mar 13 21:17:19 crc kubenswrapper[4790]: I0313 21:17:19.682322 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" path="/var/lib/kubelet/pods/848a94d0-7273-4bd8-a9a3-37a0c83d021d/volumes" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.015484 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.016070 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.016121 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.016943 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.017001 4790 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" gracePeriod=600 Mar 13 21:17:44 crc kubenswrapper[4790]: E0313 21:17:44.139601 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317278 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" exitCode=0 Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317321 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36"} Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317358 4790 scope.go:117] "RemoveContainer" containerID="75f331721e6162201038d479ba2bbbbd3f6476b2bf5be1d38a4c2de09e217795" Mar 13 21:17:44 crc kubenswrapper[4790]: I0313 21:17:44.317975 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:17:44 crc kubenswrapper[4790]: E0313 21:17:44.318223 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:17:58 crc kubenswrapper[4790]: I0313 21:17:58.659697 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:17:58 crc kubenswrapper[4790]: E0313 21:17:58.660540 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.143219 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:18:00 crc kubenswrapper[4790]: E0313 21:18:00.144005 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-utilities" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144028 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-utilities" Mar 13 21:18:00 crc kubenswrapper[4790]: E0313 21:18:00.144058 4790 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144070 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" Mar 13 21:18:00 crc kubenswrapper[4790]: E0313 21:18:00.144093 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-content" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144106 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="extract-content" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.144432 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="848a94d0-7273-4bd8-a9a3-37a0c83d021d" containerName="registry-server" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.145236 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.147553 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.147632 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.147693 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.160356 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.299165 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"auto-csr-approver-29557278-6n5t9\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.401744 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"auto-csr-approver-29557278-6n5t9\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.426256 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"auto-csr-approver-29557278-6n5t9\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.465396 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:00 crc kubenswrapper[4790]: I0313 21:18:00.895732 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:18:01 crc kubenswrapper[4790]: I0313 21:18:01.470017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" event={"ID":"4d317138-8c0e-4824-a6b0-c25bb9b79631","Type":"ContainerStarted","Data":"adc434aa7408eebe3d38ea736ed275008af1dcbea2dae8dadcd58a62aa08bd3c"} Mar 13 21:18:02 crc kubenswrapper[4790]: I0313 21:18:02.478624 4790 generic.go:334] "Generic (PLEG): container finished" podID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerID="9bdff7a81ff2a9a8995b79476629c9294b76419c09baf5ddb2aac9365620522e" exitCode=0 Mar 13 21:18:02 crc kubenswrapper[4790]: I0313 21:18:02.478687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" event={"ID":"4d317138-8c0e-4824-a6b0-c25bb9b79631","Type":"ContainerDied","Data":"9bdff7a81ff2a9a8995b79476629c9294b76419c09baf5ddb2aac9365620522e"} Mar 13 21:18:03 crc kubenswrapper[4790]: I0313 21:18:03.843228 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:03 crc kubenswrapper[4790]: I0313 21:18:03.975649 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") pod \"4d317138-8c0e-4824-a6b0-c25bb9b79631\" (UID: \"4d317138-8c0e-4824-a6b0-c25bb9b79631\") " Mar 13 21:18:03 crc kubenswrapper[4790]: I0313 21:18:03.983945 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9" (OuterVolumeSpecName: "kube-api-access-7l2c9") pod "4d317138-8c0e-4824-a6b0-c25bb9b79631" (UID: "4d317138-8c0e-4824-a6b0-c25bb9b79631"). InnerVolumeSpecName "kube-api-access-7l2c9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.078613 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7l2c9\" (UniqueName: \"kubernetes.io/projected/4d317138-8c0e-4824-a6b0-c25bb9b79631-kube-api-access-7l2c9\") on node \"crc\" DevicePath \"\"" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.499957 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" event={"ID":"4d317138-8c0e-4824-a6b0-c25bb9b79631","Type":"ContainerDied","Data":"adc434aa7408eebe3d38ea736ed275008af1dcbea2dae8dadcd58a62aa08bd3c"} Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.500011 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adc434aa7408eebe3d38ea736ed275008af1dcbea2dae8dadcd58a62aa08bd3c" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.500010 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557278-6n5t9" Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.907601 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:18:04 crc kubenswrapper[4790]: I0313 21:18:04.916955 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557272-cdqgf"] Mar 13 21:18:05 crc kubenswrapper[4790]: I0313 21:18:05.669124 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b51130e-f39a-4a4c-b41e-c865e51004dd" path="/var/lib/kubelet/pods/7b51130e-f39a-4a4c-b41e-c865e51004dd/volumes" Mar 13 21:18:13 crc kubenswrapper[4790]: I0313 21:18:13.660537 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:13 crc kubenswrapper[4790]: E0313 21:18:13.661424 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:19 crc kubenswrapper[4790]: I0313 21:18:19.101243 4790 scope.go:117] "RemoveContainer" containerID="4ea25a336829635b84ca0d8e478c73129cc595166d50214c193658a79404456f" Mar 13 21:18:24 crc kubenswrapper[4790]: I0313 21:18:24.659931 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:24 crc kubenswrapper[4790]: E0313 21:18:24.660731 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:39 crc kubenswrapper[4790]: I0313 21:18:39.666207 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:39 crc kubenswrapper[4790]: E0313 21:18:39.666770 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:18:54 crc kubenswrapper[4790]: I0313 21:18:54.659805 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:18:54 crc kubenswrapper[4790]: E0313 21:18:54.660710 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 
21:19:05 crc kubenswrapper[4790]: I0313 21:19:05.660645 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:05 crc kubenswrapper[4790]: E0313 21:19:05.661432 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:16 crc kubenswrapper[4790]: I0313 21:19:16.659895 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:16 crc kubenswrapper[4790]: E0313 21:19:16.660752 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:31 crc kubenswrapper[4790]: I0313 21:19:31.659568 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:31 crc kubenswrapper[4790]: E0313 21:19:31.660503 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:43 crc kubenswrapper[4790]: I0313 21:19:43.660413 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:43 crc kubenswrapper[4790]: E0313 21:19:43.661336 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:19:58 crc kubenswrapper[4790]: I0313 21:19:58.660241 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:19:58 crc kubenswrapper[4790]: E0313 21:19:58.660974 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.140494 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:20:00 crc 
kubenswrapper[4790]: E0313 21:20:00.141221 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerName="oc" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.141239 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerName="oc" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.141473 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" containerName="oc" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.142128 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.144196 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.144598 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.144828 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.152748 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.225363 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"auto-csr-approver-29557280-f6wtq\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.327069 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"auto-csr-approver-29557280-f6wtq\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.349587 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"auto-csr-approver-29557280-f6wtq\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.463245 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:00 crc kubenswrapper[4790]: I0313 21:20:00.917869 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:20:01 crc kubenswrapper[4790]: I0313 21:20:01.454989 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" event={"ID":"c215866d-1e07-4033-8e8a-d7826692bc76","Type":"ContainerStarted","Data":"2c754481c726d5c0d95efcd226007d3a00ebe6f209b3c71e8b4439a056f23e8c"} Mar 13 21:20:03 crc kubenswrapper[4790]: I0313 21:20:03.478825 4790 generic.go:334] "Generic (PLEG): container finished" podID="c215866d-1e07-4033-8e8a-d7826692bc76" containerID="1dfb1a39dcbf9770c39e6abee624c19e7caa14a0b69762f480ec12e76586b37f" exitCode=0 Mar 13 21:20:03 crc kubenswrapper[4790]: I0313 21:20:03.478932 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" event={"ID":"c215866d-1e07-4033-8e8a-d7826692bc76","Type":"ContainerDied","Data":"1dfb1a39dcbf9770c39e6abee624c19e7caa14a0b69762f480ec12e76586b37f"} Mar 13 21:20:04 crc kubenswrapper[4790]: I0313 21:20:04.858157 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.015334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") pod \"c215866d-1e07-4033-8e8a-d7826692bc76\" (UID: \"c215866d-1e07-4033-8e8a-d7826692bc76\") " Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.021445 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5" (OuterVolumeSpecName: "kube-api-access-zjnb5") pod "c215866d-1e07-4033-8e8a-d7826692bc76" (UID: "c215866d-1e07-4033-8e8a-d7826692bc76"). InnerVolumeSpecName "kube-api-access-zjnb5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.117355 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjnb5\" (UniqueName: \"kubernetes.io/projected/c215866d-1e07-4033-8e8a-d7826692bc76-kube-api-access-zjnb5\") on node \"crc\" DevicePath \"\"" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.503033 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" event={"ID":"c215866d-1e07-4033-8e8a-d7826692bc76","Type":"ContainerDied","Data":"2c754481c726d5c0d95efcd226007d3a00ebe6f209b3c71e8b4439a056f23e8c"} Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.503310 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c754481c726d5c0d95efcd226007d3a00ebe6f209b3c71e8b4439a056f23e8c" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.503104 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557280-f6wtq" Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.939852 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:20:05 crc kubenswrapper[4790]: I0313 21:20:05.953751 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557274-6gbsg"] Mar 13 21:20:07 crc kubenswrapper[4790]: I0313 21:20:07.672002 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="657afd7a-d901-4df2-96d4-239bf59388bd" path="/var/lib/kubelet/pods/657afd7a-d901-4df2-96d4-239bf59388bd/volumes" Mar 13 21:20:13 crc kubenswrapper[4790]: I0313 21:20:13.660477 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:13 crc kubenswrapper[4790]: E0313 21:20:13.661320 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:19 crc kubenswrapper[4790]: I0313 21:20:19.212513 4790 scope.go:117] "RemoveContainer" containerID="d277a0373c5a7461ab377865cd1179cec1bb76b46da5d05b6de42a92acf13b80" Mar 13 21:20:25 crc kubenswrapper[4790]: I0313 21:20:25.659824 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:25 crc kubenswrapper[4790]: E0313 21:20:25.660663 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:37 crc kubenswrapper[4790]: I0313 21:20:37.660401 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:37 crc kubenswrapper[4790]: E0313 21:20:37.661211 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:20:52 crc kubenswrapper[4790]: I0313 21:20:52.661438 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:20:52 crc kubenswrapper[4790]: E0313 21:20:52.662246 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 
21:21:06 crc kubenswrapper[4790]: I0313 21:21:06.660705 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:06 crc kubenswrapper[4790]: E0313 21:21:06.661639 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:17 crc kubenswrapper[4790]: I0313 21:21:17.660113 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:17 crc kubenswrapper[4790]: E0313 21:21:17.660859 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:28 crc kubenswrapper[4790]: I0313 21:21:28.661259 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:28 crc kubenswrapper[4790]: E0313 21:21:28.662066 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:43 crc kubenswrapper[4790]: I0313 21:21:43.660703 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:43 crc kubenswrapper[4790]: E0313 21:21:43.661659 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:21:57 crc kubenswrapper[4790]: I0313 21:21:57.660124 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:21:57 crc kubenswrapper[4790]: E0313 21:21:57.661737 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.140627 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:22:00 crc 
kubenswrapper[4790]: E0313 21:22:00.141879 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" containerName="oc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.141899 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" containerName="oc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.142126 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" containerName="oc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.142875 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.145057 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.145334 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.145579 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.151918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.190729 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"auto-csr-approver-29557282-h6blc\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.292745 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"auto-csr-approver-29557282-h6blc\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.311967 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"auto-csr-approver-29557282-h6blc\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.511156 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:00 crc kubenswrapper[4790]: I0313 21:22:00.990323 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:22:01 crc kubenswrapper[4790]: I0313 21:22:01.379757 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerStarted","Data":"8a86fcaafe9bdb23070e68c0a31f33e4af357b0443043182911a908590e57eb0"} Mar 13 21:22:03 crc kubenswrapper[4790]: I0313 21:22:03.397652 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerStarted","Data":"5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0"} Mar 13 21:22:03 crc kubenswrapper[4790]: I0313 21:22:03.409441 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557282-h6blc" podStartSLOduration=1.422196846 podStartE2EDuration="3.409423333s" podCreationTimestamp="2026-03-13 21:22:00 +0000 UTC" firstStartedPulling="2026-03-13 21:22:01.007885201 +0000 UTC m=+3252.029001092" lastFinishedPulling="2026-03-13 21:22:02.995111688 +0000 UTC m=+3254.016227579" observedRunningTime="2026-03-13 21:22:03.407738966 +0000 UTC m=+3254.428854847" watchObservedRunningTime="2026-03-13 21:22:03.409423333 +0000 UTC m=+3254.430539224" Mar 13 21:22:04 crc kubenswrapper[4790]: I0313 21:22:04.408968 4790 generic.go:334] "Generic (PLEG): container finished" podID="e4a5c228-86a3-4945-8d95-44db739406d7" containerID="5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0" exitCode=0 Mar 13 21:22:04 crc kubenswrapper[4790]: I0313 21:22:04.409049 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerDied","Data":"5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0"} Mar 13 21:22:05 crc kubenswrapper[4790]: I0313 21:22:05.826635 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:05 crc kubenswrapper[4790]: I0313 21:22:05.901863 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") pod \"e4a5c228-86a3-4945-8d95-44db739406d7\" (UID: \"e4a5c228-86a3-4945-8d95-44db739406d7\") " Mar 13 21:22:05 crc kubenswrapper[4790]: I0313 21:22:05.907797 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc" (OuterVolumeSpecName: "kube-api-access-h2gmc") pod "e4a5c228-86a3-4945-8d95-44db739406d7" (UID: "e4a5c228-86a3-4945-8d95-44db739406d7"). InnerVolumeSpecName "kube-api-access-h2gmc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.004843 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2gmc\" (UniqueName: \"kubernetes.io/projected/e4a5c228-86a3-4945-8d95-44db739406d7-kube-api-access-h2gmc\") on node \"crc\" DevicePath \"\"" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.433017 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557282-h6blc" event={"ID":"e4a5c228-86a3-4945-8d95-44db739406d7","Type":"ContainerDied","Data":"8a86fcaafe9bdb23070e68c0a31f33e4af357b0443043182911a908590e57eb0"} Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.433394 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a86fcaafe9bdb23070e68c0a31f33e4af357b0443043182911a908590e57eb0" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.433100 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557282-h6blc" Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.504319 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:22:06 crc kubenswrapper[4790]: I0313 21:22:06.516049 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557276-9dtrx"] Mar 13 21:22:07 crc kubenswrapper[4790]: I0313 21:22:07.670794 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f2961af-f195-403c-bfa2-fd01638789d4" path="/var/lib/kubelet/pods/1f2961af-f195-403c-bfa2-fd01638789d4/volumes" Mar 13 21:22:10 crc kubenswrapper[4790]: I0313 21:22:10.660726 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:10 crc kubenswrapper[4790]: E0313 21:22:10.661556 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:19 crc kubenswrapper[4790]: I0313 21:22:19.321275 4790 scope.go:117] "RemoveContainer" containerID="1ca2f5093d0685422bd455148422a18e13291d0f890d95cbb35dbff344da7e0a" Mar 13 21:22:22 crc kubenswrapper[4790]: I0313 21:22:22.660128 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:22 crc kubenswrapper[4790]: E0313 21:22:22.660770 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:34 crc kubenswrapper[4790]: I0313 21:22:34.660308 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:34 crc kubenswrapper[4790]: E0313 21:22:34.661071 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:22:49 crc kubenswrapper[4790]: I0313 21:22:49.668501 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:22:50 crc kubenswrapper[4790]: I0313 21:22:50.812114 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7"} Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.488344 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:31 crc kubenswrapper[4790]: E0313 21:23:31.489452 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" containerName="oc" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.489470 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" containerName="oc" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.489740 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" containerName="oc" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.494685 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.529918 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.653130 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-utilities\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.653205 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-catalog-content\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.653316 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtd2p\" (UniqueName: \"kubernetes.io/projected/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-kube-api-access-gtd2p\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.754844 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-catalog-content\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " 
pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.754974 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtd2p\" (UniqueName: \"kubernetes.io/projected/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-kube-api-access-gtd2p\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.755134 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-utilities\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.755982 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-catalog-content\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.756693 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-utilities\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.779434 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtd2p\" (UniqueName: \"kubernetes.io/projected/2ab722e2-16ac-40ba-9c44-903bf6bb8db8-kube-api-access-gtd2p\") pod \"community-operators-qcdqx\" (UID: \"2ab722e2-16ac-40ba-9c44-903bf6bb8db8\") " pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:31 crc kubenswrapper[4790]: I0313 21:23:31.853443 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:32 crc kubenswrapper[4790]: I0313 21:23:32.156049 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.155404 4790 generic.go:334] "Generic (PLEG): container finished" podID="2ab722e2-16ac-40ba-9c44-903bf6bb8db8" containerID="c9e3387e3b57059a7c47d9d4c2339fa61974767686b0ed4a8f9abdc3174ca87d" exitCode=0 Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.155455 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerDied","Data":"c9e3387e3b57059a7c47d9d4c2339fa61974767686b0ed4a8f9abdc3174ca87d"} Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.156418 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerStarted","Data":"45d8ec9a6d95741fa5bd7264d04cc763f01fa0de802d313de423dceeb441c88d"} Mar 13 21:23:33 crc kubenswrapper[4790]: I0313 21:23:33.157778 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:23:37 crc kubenswrapper[4790]: I0313 21:23:37.190610 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerStarted","Data":"0a2b4475cb0e313c5d697bae625925d992fb7842bebfe72a72b3212d4e51639e"} Mar 13 21:23:38 crc kubenswrapper[4790]: I0313 21:23:38.204077 4790 generic.go:334] "Generic (PLEG): container finished" podID="2ab722e2-16ac-40ba-9c44-903bf6bb8db8" containerID="0a2b4475cb0e313c5d697bae625925d992fb7842bebfe72a72b3212d4e51639e" exitCode=0 Mar 13 21:23:38 crc kubenswrapper[4790]: I0313 21:23:38.204115 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerDied","Data":"0a2b4475cb0e313c5d697bae625925d992fb7842bebfe72a72b3212d4e51639e"} Mar 13 21:23:39 crc kubenswrapper[4790]: I0313 21:23:39.215079 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qcdqx" event={"ID":"2ab722e2-16ac-40ba-9c44-903bf6bb8db8","Type":"ContainerStarted","Data":"4ff8fa23aca6fe1e534c846faa65ea12116ad45ec2e0a9c13d4d0d98ed73111a"} Mar 13 21:23:39 crc kubenswrapper[4790]: I0313 21:23:39.236541 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qcdqx" podStartSLOduration=2.701455642 podStartE2EDuration="8.236518439s" podCreationTimestamp="2026-03-13 21:23:31 +0000 UTC" firstStartedPulling="2026-03-13 21:23:33.15754536 +0000 UTC m=+3344.178661241" lastFinishedPulling="2026-03-13 21:23:38.692608147 +0000 UTC m=+3349.713724038" observedRunningTime="2026-03-13 21:23:39.233705672 +0000 UTC m=+3350.254821593" watchObservedRunningTime="2026-03-13 21:23:39.236518439 +0000 UTC m=+3350.257634330" Mar 13 21:23:41 crc kubenswrapper[4790]: I0313 21:23:41.854151 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:41 crc kubenswrapper[4790]: I0313 21:23:41.854791 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qcdqx" 
Mar 13 21:23:41 crc kubenswrapper[4790]: I0313 21:23:41.923358 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.298351 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qcdqx" Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.388532 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qcdqx"] Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.435723 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.435959 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-cpxlj" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" containerID="cri-o://1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" gracePeriod=2 Mar 13 21:23:43 crc kubenswrapper[4790]: I0313 21:23:43.920108 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.005015 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") pod \"62b23203-ded5-4b14-8a86-89c3ce3e33df\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.005094 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") pod \"62b23203-ded5-4b14-8a86-89c3ce3e33df\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.005190 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") pod \"62b23203-ded5-4b14-8a86-89c3ce3e33df\" (UID: \"62b23203-ded5-4b14-8a86-89c3ce3e33df\") " Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.007820 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities" (OuterVolumeSpecName: "utilities") pod "62b23203-ded5-4b14-8a86-89c3ce3e33df" (UID: "62b23203-ded5-4b14-8a86-89c3ce3e33df"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.015958 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv" (OuterVolumeSpecName: "kube-api-access-z7qhv") pod "62b23203-ded5-4b14-8a86-89c3ce3e33df" (UID: "62b23203-ded5-4b14-8a86-89c3ce3e33df"). InnerVolumeSpecName "kube-api-access-z7qhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.076607 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62b23203-ded5-4b14-8a86-89c3ce3e33df" (UID: "62b23203-ded5-4b14-8a86-89c3ce3e33df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.107684 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7qhv\" (UniqueName: \"kubernetes.io/projected/62b23203-ded5-4b14-8a86-89c3ce3e33df-kube-api-access-z7qhv\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.107722 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.107734 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62b23203-ded5-4b14-8a86-89c3ce3e33df-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262238 4790 generic.go:334] "Generic (PLEG): container finished" podID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" exitCode=0 Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262293 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cpxlj" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262299 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5"} Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262354 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cpxlj" event={"ID":"62b23203-ded5-4b14-8a86-89c3ce3e33df","Type":"ContainerDied","Data":"abfa15f6de4daed047e18e5a602cd0577d104072963eda4b67a1d006df7fb930"} Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.262398 4790 scope.go:117] "RemoveContainer" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.301435 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.311038 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-cpxlj"] Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.321320 4790 scope.go:117] "RemoveContainer" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.369103 4790 scope.go:117] "RemoveContainer" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.403090 4790 scope.go:117] "RemoveContainer" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" Mar 13 21:23:44 crc kubenswrapper[4790]: E0313 21:23:44.403568 4790 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5\": container with ID starting with 1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5 not found: ID does not exist" containerID="1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.403617 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5"} err="failed to get container status \"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5\": rpc error: code = NotFound desc = could not find container \"1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5\": container with ID starting with 1381f4d18bedbd1510d9c9c976eff9fd4f8533d22a41f3fd5b6abb7f3eb6b7c5 not found: ID does not exist" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.403642 4790 scope.go:117] "RemoveContainer" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" Mar 13 21:23:44 crc kubenswrapper[4790]: E0313 21:23:44.403997 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2\": container with ID starting with abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2 not found: ID does not exist" containerID="abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.404036 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2"} err="failed to get container status \"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2\": rpc error: code = NotFound desc = could not find container \"abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2\": container with ID starting with abcbcd3722b56b4963eb3f3570a8fdd447fe3802f521a3be266ce0b4fb838ac2 not found: ID does not exist" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.404090 4790 scope.go:117] "RemoveContainer" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" Mar 13 21:23:44 crc kubenswrapper[4790]: E0313 21:23:44.404343 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc\": container with ID starting with 425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc not found: ID does not exist" containerID="425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc" Mar 13 21:23:44 crc kubenswrapper[4790]: I0313 21:23:44.404368 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc"} err="failed to get container status \"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc\": rpc error: code = NotFound desc = could not find container \"425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc\": container with ID starting with 425aaeba7e7d5553bbf3404989d75b923a49be0649aab9f4f7747d04b2f856fc not found: ID does not exist" Mar 13 21:23:45 crc kubenswrapper[4790]: I0313 21:23:45.671003 4790 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" path="/var/lib/kubelet/pods/62b23203-ded5-4b14-8a86-89c3ce3e33df/volumes" Mar 13 21:23:47 crc kubenswrapper[4790]: I0313 21:23:47.289015 4790 generic.go:334] "Generic (PLEG): container finished" podID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerID="5f3dc8212dd652060ecb9c9d45ce324d2168353ccf633608e4415a58fb8949f8" exitCode=0 Mar 13 21:23:47 crc kubenswrapper[4790]: I0313 21:23:47.289129 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerDied","Data":"5f3dc8212dd652060ecb9c9d45ce324d2168353ccf633608e4415a58fb8949f8"} Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.628959 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.693767 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.693870 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.693943 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694007 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694038 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694103 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694196 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694238 4790 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.694327 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") pod \"50c1f858-4451-4e6e-9e80-6e37528305a2\" (UID: \"50c1f858-4451-4e6e-9e80-6e37528305a2\") " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.702094 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.702488 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr" (OuterVolumeSpecName: "kube-api-access-pbzlr") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "kube-api-access-pbzlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.702722 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.703479 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data" (OuterVolumeSpecName: "config-data") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.708579 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "test-operator-logs") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.726774 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "ca-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.740647 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.741506 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.750245 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "50c1f858-4451-4e6e-9e80-6e37528305a2" (UID: "50c1f858-4451-4e6e-9e80-6e37528305a2"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800030 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800074 4790 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800091 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800104 4790 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800117 4790 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/50c1f858-4451-4e6e-9e80-6e37528305a2-ssh-key\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800141 4790 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800154 4790 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/50c1f858-4451-4e6e-9e80-6e37528305a2-config-data\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800166 4790 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/50c1f858-4451-4e6e-9e80-6e37528305a2-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath 
\"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.800179 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbzlr\" (UniqueName: \"kubernetes.io/projected/50c1f858-4451-4e6e-9e80-6e37528305a2-kube-api-access-pbzlr\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.820371 4790 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Mar 13 21:23:48 crc kubenswrapper[4790]: I0313 21:23:48.902355 4790 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Mar 13 21:23:49 crc kubenswrapper[4790]: I0313 21:23:49.317762 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"50c1f858-4451-4e6e-9e80-6e37528305a2","Type":"ContainerDied","Data":"9ae048789cc06b95d8d9a690f59586791ebaca094ac82840d0dd227be9680876"} Mar 13 21:23:49 crc kubenswrapper[4790]: I0313 21:23:49.317824 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9ae048789cc06b95d8d9a690f59586791ebaca094ac82840d0dd227be9680876" Mar 13 21:23:49 crc kubenswrapper[4790]: I0313 21:23:49.317911 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.170654 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173076 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-content" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173192 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-content" Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173330 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173445 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173547 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerName="tempest-tests-tempest-tests-runner" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173644 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerName="tempest-tests-tempest-tests-runner" Mar 13 21:24:00 crc kubenswrapper[4790]: E0313 21:24:00.173738 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-utilities" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.173824 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="extract-utilities" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.174184 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="62b23203-ded5-4b14-8a86-89c3ce3e33df" containerName="registry-server" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.174300 4790 
memory_manager.go:354] "RemoveStaleState removing state" podUID="50c1f858-4451-4e6e-9e80-6e37528305a2" containerName="tempest-tests-tempest-tests-runner" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.175351 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.178296 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.178423 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.179002 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.181185 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.262555 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"auto-csr-approver-29557284-fszb4\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.364782 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"auto-csr-approver-29557284-fszb4\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.389683 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"auto-csr-approver-29557284-fszb4\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.498677 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:00 crc kubenswrapper[4790]: I0313 21:24:00.944725 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:24:01 crc kubenswrapper[4790]: I0313 21:24:01.422573 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerStarted","Data":"271547d8b5572b272e3875c751e4a0cfb77044bb630cfbf410e2540184ce24db"} Mar 13 21:24:02 crc kubenswrapper[4790]: I0313 21:24:02.431366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerStarted","Data":"b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3"} Mar 13 21:24:02 crc kubenswrapper[4790]: I0313 21:24:02.453371 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557284-fszb4" podStartSLOduration=1.396279386 podStartE2EDuration="2.453353946s" podCreationTimestamp="2026-03-13 21:24:00 +0000 UTC" firstStartedPulling="2026-03-13 21:24:00.953002897 +0000 UTC m=+3371.974118788" lastFinishedPulling="2026-03-13 21:24:02.010077457 +0000 UTC m=+3373.031193348" observedRunningTime="2026-03-13 21:24:02.448108343 +0000 UTC m=+3373.469224234" watchObservedRunningTime="2026-03-13 21:24:02.453353946 +0000 UTC m=+3373.474469837" Mar 13 21:24:03 crc kubenswrapper[4790]: I0313 21:24:03.442647 4790 generic.go:334] "Generic (PLEG): container finished" podID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerID="b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3" exitCode=0 Mar 13 21:24:03 crc kubenswrapper[4790]: I0313 21:24:03.442730 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerDied","Data":"b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3"} Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.797164 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.879808 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") pod \"e71263f0-7309-4046-b71d-2ae38e13d27c\" (UID: \"e71263f0-7309-4046-b71d-2ae38e13d27c\") " Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.893192 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp" (OuterVolumeSpecName: "kube-api-access-snbfp") pod "e71263f0-7309-4046-b71d-2ae38e13d27c" (UID: "e71263f0-7309-4046-b71d-2ae38e13d27c"). InnerVolumeSpecName "kube-api-access-snbfp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:24:04 crc kubenswrapper[4790]: I0313 21:24:04.982714 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snbfp\" (UniqueName: \"kubernetes.io/projected/e71263f0-7309-4046-b71d-2ae38e13d27c-kube-api-access-snbfp\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.459151 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557284-fszb4" event={"ID":"e71263f0-7309-4046-b71d-2ae38e13d27c","Type":"ContainerDied","Data":"271547d8b5572b272e3875c751e4a0cfb77044bb630cfbf410e2540184ce24db"} Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.459201 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557284-fszb4" Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.459212 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="271547d8b5572b272e3875c751e4a0cfb77044bb630cfbf410e2540184ce24db" Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.528421 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.538117 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557278-6n5t9"] Mar 13 21:24:05 crc kubenswrapper[4790]: I0313 21:24:05.669912 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d317138-8c0e-4824-a6b0-c25bb9b79631" path="/var/lib/kubelet/pods/4d317138-8c0e-4824-a6b0-c25bb9b79631/volumes" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.317898 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:06 crc kubenswrapper[4790]: E0313 21:24:06.318991 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerName="oc" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.319024 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerName="oc" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.319222 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" containerName="oc" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.321042 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.328665 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.408598 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.408769 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.408848 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.511505 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.511635 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.511701 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.512504 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.512715 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.532684 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"redhat-operators-smm9t\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:06 crc kubenswrapper[4790]: I0313 21:24:06.643309 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.107316 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.487800 4790 generic.go:334] "Generic (PLEG): container finished" podID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" exitCode=0 Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.488083 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd"} Mar 13 21:24:07 crc kubenswrapper[4790]: I0313 21:24:07.488112 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerStarted","Data":"a27118c7da3719bfb8ec9d74d4e9c7353a1e0f88365aff82b5f056ddcc2d1492"} Mar 13 21:24:09 crc kubenswrapper[4790]: I0313 21:24:09.509319 4790 generic.go:334] "Generic (PLEG): container finished" podID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" exitCode=0 Mar 13 21:24:09 crc kubenswrapper[4790]: I0313 21:24:09.509407 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f"} Mar 13 21:24:10 crc kubenswrapper[4790]: I0313 21:24:10.520803 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerStarted","Data":"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55"} Mar 13 21:24:10 crc kubenswrapper[4790]: I0313 21:24:10.541953 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smm9t" podStartSLOduration=1.991387137 podStartE2EDuration="4.541922697s" podCreationTimestamp="2026-03-13 21:24:06 +0000 UTC" firstStartedPulling="2026-03-13 21:24:07.502830943 +0000 UTC m=+3378.523946834" lastFinishedPulling="2026-03-13 21:24:10.053366463 +0000 UTC m=+3381.074482394" observedRunningTime="2026-03-13 21:24:10.537178876 +0000 UTC m=+3381.558294777" watchObservedRunningTime="2026-03-13 21:24:10.541922697 +0000 UTC m=+3381.563038588" Mar 13 21:24:16 crc kubenswrapper[4790]: I0313 21:24:16.643642 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:16 crc kubenswrapper[4790]: I0313 21:24:16.652583 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:17 crc kubenswrapper[4790]: I0313 21:24:17.694046 4790 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-smm9t" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" probeResult="failure" output=< Mar 13 21:24:17 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:24:17 crc kubenswrapper[4790]: > Mar 13 21:24:19 crc kubenswrapper[4790]: I0313 21:24:19.414260 4790 scope.go:117] "RemoveContainer" containerID="9bdff7a81ff2a9a8995b79476629c9294b76419c09baf5ddb2aac9365620522e" Mar 13 21:24:26 crc kubenswrapper[4790]: I0313 21:24:26.691932 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:26 crc kubenswrapper[4790]: I0313 21:24:26.742262 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:26 crc kubenswrapper[4790]: I0313 21:24:26.928150 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:28 crc kubenswrapper[4790]: I0313 21:24:28.710294 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smm9t" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" containerID="cri-o://c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" gracePeriod=2 Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.133723 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269096 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") pod \"e47c235d-1228-4ded-9bc0-5dc34e05572f\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269229 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") pod \"e47c235d-1228-4ded-9bc0-5dc34e05572f\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269334 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") pod \"e47c235d-1228-4ded-9bc0-5dc34e05572f\" (UID: \"e47c235d-1228-4ded-9bc0-5dc34e05572f\") " Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.269998 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities" (OuterVolumeSpecName: "utilities") pod "e47c235d-1228-4ded-9bc0-5dc34e05572f" (UID: "e47c235d-1228-4ded-9bc0-5dc34e05572f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.275123 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz" (OuterVolumeSpecName: "kube-api-access-4vqcz") pod "e47c235d-1228-4ded-9bc0-5dc34e05572f" (UID: "e47c235d-1228-4ded-9bc0-5dc34e05572f"). InnerVolumeSpecName "kube-api-access-4vqcz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.372131 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.372621 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vqcz\" (UniqueName: \"kubernetes.io/projected/e47c235d-1228-4ded-9bc0-5dc34e05572f-kube-api-access-4vqcz\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.394688 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e47c235d-1228-4ded-9bc0-5dc34e05572f" (UID: "e47c235d-1228-4ded-9bc0-5dc34e05572f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.474028 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e47c235d-1228-4ded-9bc0-5dc34e05572f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722627 4790 generic.go:334] "Generic (PLEG): container finished" podID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" exitCode=0 Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722677 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55"} Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722718 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smm9t" event={"ID":"e47c235d-1228-4ded-9bc0-5dc34e05572f","Type":"ContainerDied","Data":"a27118c7da3719bfb8ec9d74d4e9c7353a1e0f88365aff82b5f056ddcc2d1492"} Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722741 4790 scope.go:117] "RemoveContainer" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.722774 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smm9t" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.751764 4790 scope.go:117] "RemoveContainer" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.762291 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.772994 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smm9t"] Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.775298 4790 scope.go:117] "RemoveContainer" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.826361 4790 scope.go:117] "RemoveContainer" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" Mar 13 21:24:29 crc kubenswrapper[4790]: E0313 21:24:29.826989 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55\": container with ID starting with c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55 not found: ID does not exist" containerID="c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827032 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55"} err="failed to get container status \"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55\": rpc error: code = NotFound desc = could not find container \"c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55\": container with ID starting with c8286688b734c827590cf715cef9428f4b5d6ef1290ed571ded9d1010d874f55 not found: ID does not exist" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827059 4790 scope.go:117] "RemoveContainer" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" Mar 13 21:24:29 crc kubenswrapper[4790]: E0313 21:24:29.827413 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f\": container with ID starting with b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f not found: ID does not exist" containerID="b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827436 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f"} err="failed to get container status \"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f\": rpc error: code = NotFound desc = could not find container \"b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f\": container with ID starting with b67f8916aa424359681bcf55b86d8ee6c5e2c8e33dbc7f8cd544914033f54d0f not found: ID does not exist" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.827448 4790 scope.go:117] "RemoveContainer" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" Mar 13 21:24:29 crc kubenswrapper[4790]: E0313 21:24:29.829561 4790 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd\": container with ID starting with 5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd not found: ID does not exist" containerID="5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd" Mar 13 21:24:29 crc kubenswrapper[4790]: I0313 21:24:29.829631 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd"} err="failed to get container status \"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd\": rpc error: code = NotFound desc = could not find container \"5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd\": container with ID starting with 5ff6aad9a74d3b9806cab406ced1dbfc57c706a914e5ccd3e84b344123c800fd not found: ID does not exist" Mar 13 21:24:31 crc kubenswrapper[4790]: I0313 21:24:31.672718 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" path="/var/lib/kubelet/pods/e47c235d-1228-4ded-9bc0-5dc34e05572f/volumes" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.466289 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:24:37 crc kubenswrapper[4790]: E0313 21:24:37.467222 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467237 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" Mar 13 21:24:37 crc kubenswrapper[4790]: E0313 21:24:37.467271 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-utilities" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467280 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-utilities" Mar 13 21:24:37 crc kubenswrapper[4790]: E0313 21:24:37.467297 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-content" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467305 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="extract-content" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.467498 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="e47c235d-1228-4ded-9bc0-5dc34e05572f" containerName="registry-server" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.468524 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.470108 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d6kfv"/"default-dockercfg-j27lm" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.470428 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6kfv"/"kube-root-ca.crt" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.470593 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6kfv"/"openshift-service-ca.crt" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.478903 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.544824 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.544911 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.646688 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.646781 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.647557 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.675327 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"must-gather-qf7z2\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:37 crc kubenswrapper[4790]: I0313 21:24:37.790132 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:24:38 crc kubenswrapper[4790]: I0313 21:24:38.246362 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:24:38 crc kubenswrapper[4790]: I0313 21:24:38.796278 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerStarted","Data":"bdf221b7afb6aaa5a6f44570f4a6373a7ad59c372a30ae5b7c3598adb658e3ef"} Mar 13 21:24:46 crc kubenswrapper[4790]: I0313 21:24:46.903850 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerStarted","Data":"f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb"} Mar 13 21:24:46 crc kubenswrapper[4790]: I0313 21:24:46.904414 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerStarted","Data":"3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8"} Mar 13 21:24:46 crc kubenswrapper[4790]: I0313 21:24:46.923519 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" podStartSLOduration=2.442272951 podStartE2EDuration="9.923504477s" podCreationTimestamp="2026-03-13 21:24:37 +0000 UTC" firstStartedPulling="2026-03-13 21:24:38.255899621 +0000 UTC m=+3409.277015512" lastFinishedPulling="2026-03-13 21:24:45.737131147 +0000 UTC m=+3416.758247038" observedRunningTime="2026-03-13 21:24:46.922423867 +0000 UTC m=+3417.943539758" watchObservedRunningTime="2026-03-13 21:24:46.923504477 +0000 UTC m=+3417.944620368" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.566946 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-w42ql"] Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.569006 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.699264 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.700602 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.802597 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.802656 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.803055 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.821110 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"crc-debug-w42ql\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:49 crc kubenswrapper[4790]: I0313 21:24:49.892835 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:24:50 crc kubenswrapper[4790]: I0313 21:24:50.941521 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" event={"ID":"cedbf666-53d4-4452-849e-8714a7e57e3c","Type":"ContainerStarted","Data":"fc567be766ecf269be33edfb55f52a19879095d8f295cb35981738d3c130d459"} Mar 13 21:25:04 crc kubenswrapper[4790]: I0313 21:25:04.080333 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" event={"ID":"cedbf666-53d4-4452-849e-8714a7e57e3c","Type":"ContainerStarted","Data":"18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0"} Mar 13 21:25:04 crc kubenswrapper[4790]: I0313 21:25:04.097530 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" podStartSLOduration=1.446822304 podStartE2EDuration="15.097508356s" podCreationTimestamp="2026-03-13 21:24:49 +0000 UTC" firstStartedPulling="2026-03-13 21:24:49.932539536 +0000 UTC m=+3420.953655427" lastFinishedPulling="2026-03-13 21:25:03.583225588 +0000 UTC m=+3434.604341479" observedRunningTime="2026-03-13 21:25:04.094661918 +0000 UTC m=+3435.115777809" watchObservedRunningTime="2026-03-13 21:25:04.097508356 +0000 UTC m=+3435.118624257" Mar 13 21:25:14 crc kubenswrapper[4790]: I0313 21:25:14.016206 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:25:14 crc kubenswrapper[4790]: I0313 21:25:14.016823 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:25:42 crc kubenswrapper[4790]: I0313 21:25:42.419832 4790 generic.go:334] "Generic (PLEG): container finished" podID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerID="18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0" exitCode=0 Mar 13 21:25:42 crc kubenswrapper[4790]: I0313 21:25:42.419909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" event={"ID":"cedbf666-53d4-4452-849e-8714a7e57e3c","Type":"ContainerDied","Data":"18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0"} Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.518818 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.550660 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-w42ql"] Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.560241 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-w42ql"] Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.668726 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") pod \"cedbf666-53d4-4452-849e-8714a7e57e3c\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.668957 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") pod \"cedbf666-53d4-4452-849e-8714a7e57e3c\" (UID: \"cedbf666-53d4-4452-849e-8714a7e57e3c\") " Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.670523 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host" (OuterVolumeSpecName: "host") pod "cedbf666-53d4-4452-849e-8714a7e57e3c" (UID: "cedbf666-53d4-4452-849e-8714a7e57e3c"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.674762 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk" (OuterVolumeSpecName: "kube-api-access-5vgjk") pod "cedbf666-53d4-4452-849e-8714a7e57e3c" (UID: "cedbf666-53d4-4452-849e-8714a7e57e3c"). InnerVolumeSpecName "kube-api-access-5vgjk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.772195 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5vgjk\" (UniqueName: \"kubernetes.io/projected/cedbf666-53d4-4452-849e-8714a7e57e3c-kube-api-access-5vgjk\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:43 crc kubenswrapper[4790]: I0313 21:25:43.772222 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cedbf666-53d4-4452-849e-8714a7e57e3c-host\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.016001 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.016053 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.436534 4790 scope.go:117] "RemoveContainer" containerID="18f925d7502ea72468acd4166123c68adbb9468a2d3d0cc2e7b5e323792b34d0" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.436576 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-w42ql" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.707355 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-22ftk"] Mar 13 21:25:44 crc kubenswrapper[4790]: E0313 21:25:44.708035 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerName="container-00" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.708049 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerName="container-00" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.708220 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" containerName="container-00" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.708826 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.793257 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.793513 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.895545 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.895611 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.895817 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:44 crc kubenswrapper[4790]: I0313 21:25:44.912361 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"crc-debug-22ftk\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.027124 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.447328 4790 generic.go:334] "Generic (PLEG): container finished" podID="f364e64b-c894-4978-8625-4f4680ad09f1" containerID="19ec3b81cfc93adcffb8135210e1ea8d379fb945e3eddc6ee978b60b4ce52405" exitCode=0 Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.447527 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" event={"ID":"f364e64b-c894-4978-8625-4f4680ad09f1","Type":"ContainerDied","Data":"19ec3b81cfc93adcffb8135210e1ea8d379fb945e3eddc6ee978b60b4ce52405"} Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.447687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" event={"ID":"f364e64b-c894-4978-8625-4f4680ad09f1","Type":"ContainerStarted","Data":"360e4c0b8317039594325f893b5c1fb108d16716b89b77acd4c50141cbd8cdc9"} Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.669724 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cedbf666-53d4-4452-849e-8714a7e57e3c" path="/var/lib/kubelet/pods/cedbf666-53d4-4452-849e-8714a7e57e3c/volumes" Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.877676 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-22ftk"] Mar 13 21:25:45 crc kubenswrapper[4790]: I0313 21:25:45.889055 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-22ftk"] Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.556918 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625232 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") pod \"f364e64b-c894-4978-8625-4f4680ad09f1\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625302 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") pod \"f364e64b-c894-4978-8625-4f4680ad09f1\" (UID: \"f364e64b-c894-4978-8625-4f4680ad09f1\") " Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625635 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host" (OuterVolumeSpecName: "host") pod "f364e64b-c894-4978-8625-4f4680ad09f1" (UID: "f364e64b-c894-4978-8625-4f4680ad09f1"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.625985 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f364e64b-c894-4978-8625-4f4680ad09f1-host\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.630638 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h" (OuterVolumeSpecName: "kube-api-access-4g85h") pod "f364e64b-c894-4978-8625-4f4680ad09f1" (UID: "f364e64b-c894-4978-8625-4f4680ad09f1"). InnerVolumeSpecName "kube-api-access-4g85h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:25:46 crc kubenswrapper[4790]: I0313 21:25:46.727599 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g85h\" (UniqueName: \"kubernetes.io/projected/f364e64b-c894-4978-8625-4f4680ad09f1-kube-api-access-4g85h\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.051551 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-8tm75"] Mar 13 21:25:47 crc kubenswrapper[4790]: E0313 21:25:47.053053 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" containerName="container-00" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.053099 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" containerName="container-00" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.053283 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" containerName="container-00" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.053961 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.134747 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.135199 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.236609 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.236721 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.236957 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.256338 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"crc-debug-8tm75\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " 
pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.370495 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.466622 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="360e4c0b8317039594325f893b5c1fb108d16716b89b77acd4c50141cbd8cdc9" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.466664 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-22ftk" Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.467958 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" event={"ID":"6ea26912-deee-4ff3-ad55-a6d703f99865","Type":"ContainerStarted","Data":"40c1c24f22a918346bd6c67a469263cd47d726ef2ba65c9cee278982982158da"} Mar 13 21:25:47 crc kubenswrapper[4790]: I0313 21:25:47.696107 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f364e64b-c894-4978-8625-4f4680ad09f1" path="/var/lib/kubelet/pods/f364e64b-c894-4978-8625-4f4680ad09f1/volumes" Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.477517 4790 generic.go:334] "Generic (PLEG): container finished" podID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerID="4dfa4dd75f80f12e10343292ad55bf7e12db50998c94a2d0043e7456c83d4512" exitCode=0 Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.477583 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" event={"ID":"6ea26912-deee-4ff3-ad55-a6d703f99865","Type":"ContainerDied","Data":"4dfa4dd75f80f12e10343292ad55bf7e12db50998c94a2d0043e7456c83d4512"} Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.515362 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-8tm75"] Mar 13 21:25:48 crc kubenswrapper[4790]: I0313 21:25:48.525494 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/crc-debug-8tm75"] Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.604018 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.686125 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") pod \"6ea26912-deee-4ff3-ad55-a6d703f99865\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.686244 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") pod \"6ea26912-deee-4ff3-ad55-a6d703f99865\" (UID: \"6ea26912-deee-4ff3-ad55-a6d703f99865\") " Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.686570 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host" (OuterVolumeSpecName: "host") pod "6ea26912-deee-4ff3-ad55-a6d703f99865" (UID: "6ea26912-deee-4ff3-ad55-a6d703f99865"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.687113 4790 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6ea26912-deee-4ff3-ad55-a6d703f99865-host\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.693856 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl" (OuterVolumeSpecName: "kube-api-access-z6qbl") pod "6ea26912-deee-4ff3-ad55-a6d703f99865" (UID: "6ea26912-deee-4ff3-ad55-a6d703f99865"). InnerVolumeSpecName "kube-api-access-z6qbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:25:49 crc kubenswrapper[4790]: I0313 21:25:49.789530 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6qbl\" (UniqueName: \"kubernetes.io/projected/6ea26912-deee-4ff3-ad55-a6d703f99865-kube-api-access-z6qbl\") on node \"crc\" DevicePath \"\"" Mar 13 21:25:50 crc kubenswrapper[4790]: I0313 21:25:50.493799 4790 scope.go:117] "RemoveContainer" containerID="4dfa4dd75f80f12e10343292ad55bf7e12db50998c94a2d0043e7456c83d4512" Mar 13 21:25:50 crc kubenswrapper[4790]: I0313 21:25:50.493826 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6kfv/crc-debug-8tm75" Mar 13 21:25:50 crc kubenswrapper[4790]: E0313 21:25:50.591886 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea26912_deee_4ff3_ad55_a6d703f99865.slice/crio-40c1c24f22a918346bd6c67a469263cd47d726ef2ba65c9cee278982982158da\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ea26912_deee_4ff3_ad55_a6d703f99865.slice\": RecentStats: unable to find data in memory cache]" Mar 13 21:25:51 crc kubenswrapper[4790]: I0313 21:25:51.671795 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" path="/var/lib/kubelet/pods/6ea26912-deee-4ff3-ad55-a6d703f99865/volumes" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.151819 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:26:00 crc kubenswrapper[4790]: E0313 21:26:00.152854 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerName="container-00" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.152871 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerName="container-00" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.153079 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ea26912-deee-4ff3-ad55-a6d703f99865" containerName="container-00" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.153845 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.156540 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.156566 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.157870 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.164880 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.289160 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"auto-csr-approver-29557286-j2hgs\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.390553 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"auto-csr-approver-29557286-j2hgs\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.424907 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"auto-csr-approver-29557286-j2hgs\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.477415 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:00 crc kubenswrapper[4790]: I0313 21:26:00.931308 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:26:01 crc kubenswrapper[4790]: I0313 21:26:01.659210 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" event={"ID":"93d597e8-cf42-4f34-a6c1-ffe9416a562b","Type":"ContainerStarted","Data":"ce940e262f90be5fa54ef8e07bd777275cab1626fc349844b9170a8412018fb7"} Mar 13 21:26:02 crc kubenswrapper[4790]: I0313 21:26:02.669607 4790 generic.go:334] "Generic (PLEG): container finished" podID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerID="d255e7ab1f308e1f21736aa4f57843906cd9283c436db74b17e9a79b7ff4810a" exitCode=0 Mar 13 21:26:02 crc kubenswrapper[4790]: I0313 21:26:02.669733 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" event={"ID":"93d597e8-cf42-4f34-a6c1-ffe9416a562b","Type":"ContainerDied","Data":"d255e7ab1f308e1f21736aa4f57843906cd9283c436db74b17e9a79b7ff4810a"} Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.098968 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.168531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") pod \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\" (UID: \"93d597e8-cf42-4f34-a6c1-ffe9416a562b\") " Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.175355 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs" (OuterVolumeSpecName: "kube-api-access-f7tqs") pod "93d597e8-cf42-4f34-a6c1-ffe9416a562b" (UID: "93d597e8-cf42-4f34-a6c1-ffe9416a562b"). InnerVolumeSpecName "kube-api-access-f7tqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.271159 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7tqs\" (UniqueName: \"kubernetes.io/projected/93d597e8-cf42-4f34-a6c1-ffe9416a562b-kube-api-access-f7tqs\") on node \"crc\" DevicePath \"\"" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.333677 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c6887dbdb-wnl4x_dc5e5f2f-999a-4ae6-82f1-d5942a570a3e/barbican-api/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.486650 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f469b5d-gs7bt_8a191811-ef81-4066-bcbb-0385c9258fc0/barbican-keystone-listener/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.534013 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7c6887dbdb-wnl4x_dc5e5f2f-999a-4ae6-82f1-d5942a570a3e/barbican-api-log/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.550799 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-798f469b5d-gs7bt_8a191811-ef81-4066-bcbb-0385c9258fc0/barbican-keystone-listener-log/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.687283 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" event={"ID":"93d597e8-cf42-4f34-a6c1-ffe9416a562b","Type":"ContainerDied","Data":"ce940e262f90be5fa54ef8e07bd777275cab1626fc349844b9170a8412018fb7"} Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.687321 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557286-j2hgs" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.687327 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ce940e262f90be5fa54ef8e07bd777275cab1626fc349844b9170a8412018fb7" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.776314 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d9ddc9bbc-tg88r_98f92730-30b3-4583-ab7c-258c0a0880a2/barbican-worker/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.799159 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d9ddc9bbc-tg88r_98f92730-30b3-4583-ab7c-258c0a0880a2/barbican-worker-log/0.log" Mar 13 21:26:04 crc kubenswrapper[4790]: I0313 21:26:04.997347 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-ntq8n_5fc3181b-a2df-4d5c-afa1-057cef46dd95/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.077892 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/ceilometer-central-agent/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.133974 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/ceilometer-notification-agent/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.173340 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.184034 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557280-f6wtq"] Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.308338 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/proxy-httpd/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.350174 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_d2645f50-482e-487d-9b16-c2a066630480/sg-core/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.379661 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c42a2a27-f7c5-463b-982a-4dafcac978ad/cinder-api/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.563526 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_c42a2a27-f7c5-463b-982a-4dafcac978ad/cinder-api-log/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.622193 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5/cinder-scheduler/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.756806 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c215866d-1e07-4033-8e8a-d7826692bc76" path="/var/lib/kubelet/pods/c215866d-1e07-4033-8e8a-d7826692bc76/volumes" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.786109 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_3ccdc6f2-f911-48c1-b8a8-dc6f2054fed5/probe/0.log" Mar 13 21:26:05 crc kubenswrapper[4790]: I0313 21:26:05.947489 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-vg564_c1609d29-96e5-43eb-a086-5587ca7c4f5a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.063150 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-fb2tk_f7e18dc0-dbbb-419e-bdad-22b5f08ffa6f/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.180532 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p5ml2_66175627-2b03-49c6-a7a1-de69f8851d9a/init/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.378133 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p5ml2_66175627-2b03-49c6-a7a1-de69f8851d9a/init/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.391587 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-55478c4467-p5ml2_66175627-2b03-49c6-a7a1-de69f8851d9a/dnsmasq-dns/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.499837 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-cfp58_304addb4-f579-42f8-87d8-8e15b713aef2/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.626486 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8c1c1847-eb77-4170-8034-e58ba375ad84/glance-httpd/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.633054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_8c1c1847-eb77-4170-8034-e58ba375ad84/glance-log/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.826900 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5b10e44-e0ce-4568-b33c-dd9855d61fd7/glance-httpd/0.log" Mar 13 21:26:06 crc kubenswrapper[4790]: I0313 21:26:06.844625 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_b5b10e44-e0ce-4568-b33c-dd9855d61fd7/glance-log/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.107090 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-686b857b8-6fghv_d0f5105d-51ea-4e5e-832f-8302188a943a/horizon/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.209953 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-mznb5_77bc94c9-b530-4ea9-8c94-0d5a985fb930/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.367686 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-686b857b8-6fghv_d0f5105d-51ea-4e5e-832f-8302188a943a/horizon-log/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.392347 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-mjj4b_04553c47-94a9-465f-a241-9188784794de/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.745717 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29557261-5pp9q_65a6ecdc-c1c2-4cb1-b7c5-64f500aa9648/keystone-cron/0.log" Mar 13 21:26:07 crc 
kubenswrapper[4790]: I0313 21:26:07.787027 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-c5788df58-llnz4_4c3cfa50-a4b5-45e0-9cb4-d6a5495f4fb7/keystone-api/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.960503 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_2ae1ef11-086d-4d65-bfcb-987f3973fdc5/kube-state-metrics/0.log" Mar 13 21:26:07 crc kubenswrapper[4790]: I0313 21:26:07.984392 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-dsxjx_c70cf667-ebdd-414d-be40-62d26209abcf/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:08 crc kubenswrapper[4790]: I0313 21:26:08.297672 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77f687ff4f-d7b7z_6c6f5d56-217d-441e-8771-503fd5e681fb/neutron-api/0.log" Mar 13 21:26:08 crc kubenswrapper[4790]: I0313 21:26:08.377111 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-77f687ff4f-d7b7z_6c6f5d56-217d-441e-8771-503fd5e681fb/neutron-httpd/0.log" Mar 13 21:26:08 crc kubenswrapper[4790]: I0313 21:26:08.559877 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-7hpx4_944a84ad-4d2b-4d1b-ae69-a4f861e7d3c0/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.136890 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4597d91c-0f9f-4e33-aaa7-b25e7076e13a/nova-api-log/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.450279 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_253ef3a1-1764-4120-a5f8-db908a0e7fd4/nova-cell0-conductor-conductor/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.499233 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_0c78f8ec-a8d8-43e0-b650-b9e1cf1d669b/nova-cell1-conductor-conductor/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.557977 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4597d91c-0f9f-4e33-aaa7-b25e7076e13a/nova-api-api/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.789395 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_20c0842a-c69a-4af0-aef0-ffec3f3560bc/nova-cell1-novncproxy-novncproxy/0.log" Mar 13 21:26:09 crc kubenswrapper[4790]: I0313 21:26:09.890126 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-lhxzv_7b947c94-305d-453d-b2f0-bcf3c84467b3/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.048821 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_00b43558-bdf4-45e4-b1bc-6e9b325e163b/nova-metadata-log/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.286339 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fa2face0-9349-4482-880a-b23cf41099b2/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.308585 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_01e86425-f126-4827-b727-e8c73d152aa6/nova-scheduler-scheduler/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.363251 4790 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_00b43558-bdf4-45e4-b1bc-6e9b325e163b/nova-metadata-metadata/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.432055 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fa2face0-9349-4482-880a-b23cf41099b2/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.470482 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_fa2face0-9349-4482-880a-b23cf41099b2/galera/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.582176 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fceb0829-5f0e-4e78-a803-61afc5aa4d60/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.785086 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fceb0829-5f0e-4e78-a803-61afc5aa4d60/galera/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.809523 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_fceb0829-5f0e-4e78-a803-61afc5aa4d60/mysql-bootstrap/0.log" Mar 13 21:26:10 crc kubenswrapper[4790]: I0313 21:26:10.825446 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7f0237c2-5c72-4776-9226-67244abca8dd/openstackclient/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.030050 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-nrv7g_dfb0e0ca-d164-4e22-9d3f-055a45a372d2/openstack-network-exporter/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.102604 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovsdb-server-init/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.341225 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovs-vswitchd/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.356949 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovsdb-server/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.361504 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-k7bzr_8c2d7175-fc2b-4492-ac1c-e2cc3dd44c58/ovsdb-server-init/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.535802 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-vspq5_c72ac557-7882-4120-b64a-4343639cc766/ovn-controller/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.623289 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-7d9cq_fe27e2d5-7108-4d49-99bb-15208f36cff7/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.843116 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_18e18c94-0ce6-4578-a224-384826512a34/ovn-northd/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.870884 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_18e18c94-0ce6-4578-a224-384826512a34/openstack-network-exporter/0.log" Mar 13 21:26:11 crc kubenswrapper[4790]: I0313 21:26:11.953235 4790 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f5a24d7e-902f-4862-9c6b-8317f8fb3f29/openstack-network-exporter/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.112133 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_f5a24d7e-902f-4862-9c6b-8317f8fb3f29/ovsdbserver-nb/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.114762 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ba4867dc-70fb-4533-a075-31fc03f7ef33/openstack-network-exporter/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.147228 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_ba4867dc-70fb-4533-a075-31fc03f7ef33/ovsdbserver-sb/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.362485 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-854ddc4bd-b4ws7_b14c1738-5e9e-4810-b926-5b05af9ec22d/placement-api/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.422847 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-854ddc4bd-b4ws7_b14c1738-5e9e-4810-b926-5b05af9ec22d/placement-log/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.532952 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9/setup-container/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.729235 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9/setup-container/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.844170 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_4ac7c2bb-fa6a-437a-9af3-d4ffa930bdf9/rabbitmq/0.log" Mar 13 21:26:12 crc kubenswrapper[4790]: I0313 21:26:12.881850 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72ed8a4f-a46a-4e41-9335-f10dc6338627/setup-container/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.002085 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72ed8a4f-a46a-4e41-9335-f10dc6338627/setup-container/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.090875 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-gqbmt_7cb0d614-f5d9-4862-8059-ad323eec6c59/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.100080 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_72ed8a4f-a46a-4e41-9335-f10dc6338627/rabbitmq/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.298556 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zgxl2_6383acac-fad0-45d2-8263-da2ceb0b9e83/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.345274 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-hzb9h_37459d15-1599-492b-8710-7723829a096d/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.509625 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-rhtx8_d19bd67c-441b-4813-8cc3-07c8cf446e42/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.627457 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-pdxrj_ee77aab6-b3c2-4925-a715-428a4c5e5bd9/ssh-known-hosts-edpm-deployment/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.848325 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-798495789f-5fvw5_7d498924-f84f-48aa-b971-b58cbea48295/proxy-server/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.909368 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-798495789f-5fvw5_7d498924-f84f-48aa-b971-b58cbea48295/proxy-httpd/0.log" Mar 13 21:26:13 crc kubenswrapper[4790]: I0313 21:26:13.980124 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-dv686_b4ea3695-dddc-48fe-bdb6-eb0450c697c4/swift-ring-rebalance/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.015209 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.015275 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.015327 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.016219 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.016292 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7" gracePeriod=600 Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.119542 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-reaper/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.206950 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-auditor/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.221281 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-replicator/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: 
I0313 21:26:14.294452 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/account-server/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.392136 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-auditor/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.419020 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-replicator/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.442436 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-server/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.573164 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-auditor/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.599972 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/container-updater/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.613821 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-expirer/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.695918 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-replicator/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.773821 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-server/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.858627 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/rsync/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887405 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7" exitCode=0 Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887464 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7"} Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887498 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80"} Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.887520 4790 scope.go:117] "RemoveContainer" containerID="e6f6929c77e4c390cf78a1e8890b6730b0ae129ede203953166488821564fb36" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.935960 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/object-updater/0.log" Mar 13 21:26:14 crc kubenswrapper[4790]: I0313 21:26:14.960972 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_529b41ec-f1ee-432c-ac41-6957e1809aaa/swift-recon-cron/0.log" Mar 13 21:26:15 crc kubenswrapper[4790]: I0313 21:26:15.167539 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-lkcwr_71b17a66-faf5-4379-ace9-a4fff12cac5b/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:15 crc kubenswrapper[4790]: I0313 21:26:15.212624 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_50c1f858-4451-4e6e-9e80-6e37528305a2/tempest-tests-tempest-tests-runner/0.log" Mar 13 21:26:15 crc kubenswrapper[4790]: I0313 21:26:15.366971 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-n54ff_20beb5d9-49e6-47c7-a3ad-107ff79e56fd/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 13 21:26:19 crc kubenswrapper[4790]: I0313 21:26:19.545035 4790 scope.go:117] "RemoveContainer" containerID="1dfb1a39dcbf9770c39e6abee624c19e7caa14a0b69762f480ec12e76586b37f" Mar 13 21:26:23 crc kubenswrapper[4790]: I0313 21:26:23.724774 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_3980f8da-ddaa-4634-8c09-1a71ae19c58f/memcached/0.log" Mar 13 21:26:38 crc kubenswrapper[4790]: I0313 21:26:38.856401 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/util/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.083791 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/util/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.085455 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/pull/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.122597 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/pull/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.332914 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/util/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.340981 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/extract/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.348929 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_a89c8ac2517d4a03b907c81221df4e2b81e414ad200438c5b5185b1fefvdxlk_4f787e63-2dda-4c6f-9c43-0b61658fed8c/pull/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.658099 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-d47688694-s8p67_bdbe5269-1150-4269-bc28-1d719f1b77b6/manager/0.log" Mar 13 21:26:39 crc kubenswrapper[4790]: I0313 21:26:39.799075 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_designate-operator-controller-manager-66d56f6ff4-h7rc9_46fb44a5-f567-4f58-80b1-dd70694f9339/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.039220 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-5964f64c48-tzx96_e154cc44-2769-4bfe-b8ef-3f6c56f08f74/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.053289 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-77b6666d85-q5nj7_a7488d00-50bc-4ce8-ae0a-8d3ff807c0da/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.256671 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-6d9d6b584d-nzdzx_460b6997-f558-4e5f-9e15-aa33fece4f4b/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.480904 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-984cd4dcf-5plwh_dd8df218-c492-4e48-93a9-f5f2dbf7fc00/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.575657 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-5bc894d9b-wfltj_2747d064-d45f-4a4e-87c2-d2c9f82eac10/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.757613 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-54dc5b8f8d-jrr7h_7caf7136-8a46-410b-8a32-72ab19e8baca/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.806298 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-684f77d66d-5vcsg_77f24ce6-bc52-4831-902c-255983a8f911/manager/0.log" Mar 13 21:26:40 crc kubenswrapper[4790]: I0313 21:26:40.979598 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-57b484b4df-hlk9s_b5a018c4-3e3a-4f77-a272-20c94a5b9c7a/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.024991 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-5b6b6b4c9f-pjd9v_5befe4e4-4574-42ac-90ce-ac67c1e33eee/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.182993 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-776c5696bf-dxntp_499aa973-6f5e-4229-9282-52c4fbf0625f/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.333860 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-7f84474648-b8lpj_386f7e46-c2e3-4eae-aa82-05075883c889/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.395059 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5f4f55cb5c-tbbfl_403c2990-8871-47da-abd8-8c9fc5753d54/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.529466 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-557ccf57b7pdqgn_5622f52e-2e94-41ca-a9d2-a0c833895937/manager/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.694472 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5c46d6fb64-bj72t_87b8083b-23ab-4733-a7ac-85bf1e565551/operator/0.log" Mar 13 21:26:41 crc kubenswrapper[4790]: I0313 21:26:41.866194 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-58vcj_db35ffd8-ac53-48ad-8035-53066c9df48b/registry-server/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.180365 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-574d45c66c-c9lbv_b36f993b-25cd-4f12-bf48-77bf6f4cf26b/manager/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.213823 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-bbc5b68f9-hwdv8_b1273818-139a-4213-b23c-609a7305c92f/manager/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.512954 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-xvrl9_22e6d110-bd87-4d28-851d-307b4223ee8f/operator/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.689621 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-7f9cc5dd44-ppzzz_0244e4ae-2ccd-482a-b490-58a8e46ab53d/manager/0.log" Mar 13 21:26:42 crc kubenswrapper[4790]: I0313 21:26:42.781513 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-6854b8b9d9-f8l4s_2032df10-91a5-4a88-9705-c355f50a5024/manager/0.log" Mar 13 21:26:43 crc kubenswrapper[4790]: I0313 21:26:43.027547 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-cfb9g_a36ba835-deb4-41f5-9b6a-57d1e577c8b1/manager/0.log" Mar 13 21:26:43 crc kubenswrapper[4790]: I0313 21:26:43.028614 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5698bc49b8-xpzcd_bf0c2c50-711c-4fbd-8c15-64bf6fc3572b/manager/0.log" Mar 13 21:26:43 crc kubenswrapper[4790]: I0313 21:26:43.151012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-5689f_47bdfeda-c97a-40b5-82f8-1008ba20e75b/manager/0.log" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.206630 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:26:51 crc kubenswrapper[4790]: E0313 21:26:51.207738 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerName="oc" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.207753 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerName="oc" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.207958 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" containerName="oc" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.209489 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.229150 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.290971 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.291150 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.291320 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.393342 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.393473 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.393540 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.394435 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.394712 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.415655 4790 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"redhat-marketplace-4xklq\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:51 crc kubenswrapper[4790]: I0313 21:26:51.526202 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:26:52 crc kubenswrapper[4790]: I0313 21:26:52.091487 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:26:52 crc kubenswrapper[4790]: I0313 21:26:52.199654 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerStarted","Data":"fd1ca7f5767685d5f0304ca00ae3b1facfd52b9570d737e7f9d5888cc79d70ba"} Mar 13 21:26:53 crc kubenswrapper[4790]: I0313 21:26:53.210174 4790 generic.go:334] "Generic (PLEG): container finished" podID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" exitCode=0 Mar 13 21:26:53 crc kubenswrapper[4790]: I0313 21:26:53.210261 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412"} Mar 13 21:26:54 crc kubenswrapper[4790]: I0313 21:26:54.224725 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerStarted","Data":"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c"} Mar 13 21:26:55 crc kubenswrapper[4790]: I0313 21:26:55.233596 4790 generic.go:334] "Generic (PLEG): container finished" podID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" exitCode=0 Mar 13 21:26:55 crc kubenswrapper[4790]: I0313 21:26:55.233827 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c"} Mar 13 21:26:56 crc kubenswrapper[4790]: I0313 21:26:56.263200 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerStarted","Data":"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b"} Mar 13 21:26:56 crc kubenswrapper[4790]: I0313 21:26:56.292497 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4xklq" podStartSLOduration=2.740686396 podStartE2EDuration="5.292473273s" podCreationTimestamp="2026-03-13 21:26:51 +0000 UTC" firstStartedPulling="2026-03-13 21:26:53.212031572 +0000 UTC m=+3544.233147463" lastFinishedPulling="2026-03-13 21:26:55.763818449 +0000 UTC m=+3546.784934340" observedRunningTime="2026-03-13 21:26:56.286931512 +0000 UTC m=+3547.308047413" watchObservedRunningTime="2026-03-13 21:26:56.292473273 +0000 UTC m=+3547.313589164" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.526355 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.526893 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.580702 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:01 crc kubenswrapper[4790]: I0313 21:27:01.947794 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qr47g_b8f95d7e-96c6-475c-8bef-d72937cc36b4/control-plane-machine-set-operator/0.log" Mar 13 21:27:02 crc kubenswrapper[4790]: I0313 21:27:02.158552 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jfdgz_a626166a-5d74-4dd9-b838-746731bfedef/machine-api-operator/0.log" Mar 13 21:27:02 crc kubenswrapper[4790]: I0313 21:27:02.177236 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jfdgz_a626166a-5d74-4dd9-b838-746731bfedef/kube-rbac-proxy/0.log" Mar 13 21:27:02 crc kubenswrapper[4790]: I0313 21:27:02.367443 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:03 crc kubenswrapper[4790]: I0313 21:27:03.369583 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.321871 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4xklq" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" containerID="cri-o://a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" gracePeriod=2 Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.789046 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.939489 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") pod \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.939766 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") pod \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.939862 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") pod \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\" (UID: \"a2e7f774-1873-40b8-a8d8-bf1e8677b87b\") " Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.941108 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities" (OuterVolumeSpecName: "utilities") pod "a2e7f774-1873-40b8-a8d8-bf1e8677b87b" (UID: "a2e7f774-1873-40b8-a8d8-bf1e8677b87b"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.941738 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.955289 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj" (OuterVolumeSpecName: "kube-api-access-jssdj") pod "a2e7f774-1873-40b8-a8d8-bf1e8677b87b" (UID: "a2e7f774-1873-40b8-a8d8-bf1e8677b87b"). InnerVolumeSpecName "kube-api-access-jssdj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:27:04 crc kubenswrapper[4790]: I0313 21:27:04.985149 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2e7f774-1873-40b8-a8d8-bf1e8677b87b" (UID: "a2e7f774-1873-40b8-a8d8-bf1e8677b87b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.043334 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jssdj\" (UniqueName: \"kubernetes.io/projected/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-kube-api-access-jssdj\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.043365 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2e7f774-1873-40b8-a8d8-bf1e8677b87b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346406 4790 generic.go:334] "Generic (PLEG): container finished" podID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" exitCode=0 Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346448 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b"} Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346478 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4xklq" event={"ID":"a2e7f774-1873-40b8-a8d8-bf1e8677b87b","Type":"ContainerDied","Data":"fd1ca7f5767685d5f0304ca00ae3b1facfd52b9570d737e7f9d5888cc79d70ba"} Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346500 4790 scope.go:117] "RemoveContainer" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.346643 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4xklq" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.375289 4790 scope.go:117] "RemoveContainer" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.395710 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.402012 4790 scope.go:117] "RemoveContainer" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.414936 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4xklq"] Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.442680 4790 scope.go:117] "RemoveContainer" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" Mar 13 21:27:05 crc kubenswrapper[4790]: E0313 21:27:05.443144 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b\": container with ID starting with a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b not found: ID does not exist" containerID="a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443186 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b"} err="failed to get container status \"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b\": rpc error: code = NotFound desc = could not find container \"a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b\": container with ID starting with a3744c9d2c2dede951242b44e64d89d107ad82d8f4246d7a32174439e7a0d45b not found: ID does not exist" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443210 4790 scope.go:117] "RemoveContainer" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" Mar 13 21:27:05 crc kubenswrapper[4790]: E0313 21:27:05.443498 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c\": container with ID starting with 16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c not found: ID does not exist" containerID="16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443543 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c"} err="failed to get container status \"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c\": rpc error: code = NotFound desc = could not find container \"16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c\": container with ID starting with 16dcdd23560e93cad4c03de311ef5cb7ef83ab51b72d94e990c3ff8c3449673c not found: ID does not exist" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443570 4790 scope.go:117] "RemoveContainer" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" Mar 13 21:27:05 crc kubenswrapper[4790]: E0313 21:27:05.443842 4790 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412\": container with ID starting with f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412 not found: ID does not exist" containerID="f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.443873 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412"} err="failed to get container status \"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412\": rpc error: code = NotFound desc = could not find container \"f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412\": container with ID starting with f363ecae272085b309f07f3940ee5ef33158a7523e05c6c2f1bbf9cff7d9e412 not found: ID does not exist" Mar 13 21:27:05 crc kubenswrapper[4790]: I0313 21:27:05.672024 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" path="/var/lib/kubelet/pods/a2e7f774-1873-40b8-a8d8-bf1e8677b87b/volumes" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.776169 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:07 crc kubenswrapper[4790]: E0313 21:27:07.776986 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.776997 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" Mar 13 21:27:07 crc kubenswrapper[4790]: E0313 21:27:07.777017 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-content" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.777023 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-content" Mar 13 21:27:07 crc kubenswrapper[4790]: E0313 21:27:07.777039 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-utilities" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.777045 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="extract-utilities" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.777260 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2e7f774-1873-40b8-a8d8-bf1e8677b87b" containerName="registry-server" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.778665 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.794590 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.796040 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.796107 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.796133 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897191 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897282 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897318 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897643 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.897952 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:07 crc kubenswrapper[4790]: I0313 21:27:07.936743 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"certified-operators-mlctx\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:08 crc kubenswrapper[4790]: I0313 21:27:08.096567 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:08 crc kubenswrapper[4790]: I0313 21:27:08.682753 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:09 crc kubenswrapper[4790]: I0313 21:27:09.383029 4790 generic.go:334] "Generic (PLEG): container finished" podID="986a26ad-d48a-428b-99bb-7684ea902e87" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" exitCode=0 Mar 13 21:27:09 crc kubenswrapper[4790]: I0313 21:27:09.383142 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6"} Mar 13 21:27:09 crc kubenswrapper[4790]: I0313 21:27:09.383358 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerStarted","Data":"082cf7f16d850b1cc1a9caea3f6f0a73b6f9737403c81ae02f070b30089db739"} Mar 13 21:27:10 crc kubenswrapper[4790]: I0313 21:27:10.396938 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerStarted","Data":"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644"} Mar 13 21:27:11 crc kubenswrapper[4790]: I0313 21:27:11.407281 4790 generic.go:334] "Generic (PLEG): container finished" podID="986a26ad-d48a-428b-99bb-7684ea902e87" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" exitCode=0 Mar 13 21:27:11 crc kubenswrapper[4790]: I0313 21:27:11.407329 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644"} Mar 13 21:27:12 crc kubenswrapper[4790]: I0313 21:27:12.417217 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerStarted","Data":"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035"} Mar 13 21:27:12 crc kubenswrapper[4790]: I0313 21:27:12.437167 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mlctx" podStartSLOduration=3.040953993 podStartE2EDuration="5.437148686s" podCreationTimestamp="2026-03-13 21:27:07 +0000 UTC" firstStartedPulling="2026-03-13 21:27:09.385359809 +0000 UTC m=+3560.406475700" lastFinishedPulling="2026-03-13 21:27:11.781554502 +0000 UTC m=+3562.802670393" observedRunningTime="2026-03-13 21:27:12.433712892 +0000 UTC m=+3563.454828773" watchObservedRunningTime="2026-03-13 21:27:12.437148686 +0000 UTC m=+3563.458264577" Mar 13 21:27:16 crc kubenswrapper[4790]: I0313 21:27:16.077799 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/cert-manager_cert-manager-858654f9db-fgq7z_c77372fb-0649-4c32-be4f-34c3dd515246/cert-manager-controller/0.log" Mar 13 21:27:16 crc kubenswrapper[4790]: I0313 21:27:16.236319 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-vfjwg_f58ec868-a42c-463c-b65f-bf118fae6518/cert-manager-cainjector/0.log" Mar 13 21:27:16 crc kubenswrapper[4790]: I0313 21:27:16.308470 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-p4h8t_1430c143-e235-49e5-a141-78b9e3297b70/cert-manager-webhook/0.log" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.096997 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.097308 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.141905 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.512500 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:18 crc kubenswrapper[4790]: I0313 21:27:18.578965 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.482998 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mlctx" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" containerID="cri-o://428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" gracePeriod=2 Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.933597 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.967720 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") pod \"986a26ad-d48a-428b-99bb-7684ea902e87\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.967988 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") pod \"986a26ad-d48a-428b-99bb-7684ea902e87\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.968017 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") pod \"986a26ad-d48a-428b-99bb-7684ea902e87\" (UID: \"986a26ad-d48a-428b-99bb-7684ea902e87\") " Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.968891 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities" (OuterVolumeSpecName: "utilities") pod "986a26ad-d48a-428b-99bb-7684ea902e87" (UID: "986a26ad-d48a-428b-99bb-7684ea902e87"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:20 crc kubenswrapper[4790]: I0313 21:27:20.974055 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn" (OuterVolumeSpecName: "kube-api-access-gftjn") pod "986a26ad-d48a-428b-99bb-7684ea902e87" (UID: "986a26ad-d48a-428b-99bb-7684ea902e87"). InnerVolumeSpecName "kube-api-access-gftjn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.036996 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "986a26ad-d48a-428b-99bb-7684ea902e87" (UID: "986a26ad-d48a-428b-99bb-7684ea902e87"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.070123 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gftjn\" (UniqueName: \"kubernetes.io/projected/986a26ad-d48a-428b-99bb-7684ea902e87-kube-api-access-gftjn\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.070408 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.070519 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/986a26ad-d48a-428b-99bb-7684ea902e87-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496317 4790 generic.go:334] "Generic (PLEG): container finished" podID="986a26ad-d48a-428b-99bb-7684ea902e87" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" exitCode=0 Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496400 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mlctx" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496411 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035"} Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496729 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mlctx" event={"ID":"986a26ad-d48a-428b-99bb-7684ea902e87","Type":"ContainerDied","Data":"082cf7f16d850b1cc1a9caea3f6f0a73b6f9737403c81ae02f070b30089db739"} Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.496749 4790 scope.go:117] "RemoveContainer" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.534840 4790 scope.go:117] "RemoveContainer" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.538906 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.550246 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mlctx"] Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.577477 4790 scope.go:117] "RemoveContainer" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598052 4790 scope.go:117] "RemoveContainer" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" Mar 13 21:27:21 crc kubenswrapper[4790]: E0313 21:27:21.598385 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035\": container with ID starting with 428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035 not found: ID does not exist" containerID="428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598430 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035"} err="failed to get container status \"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035\": rpc error: code = NotFound desc = could not find container \"428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035\": container with ID starting with 428dfeedcb7c720ef54b32a5a9360796878b0a501fc2388a5acbe7f7bbf51035 not found: ID does not exist" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598454 4790 scope.go:117] "RemoveContainer" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" Mar 13 21:27:21 crc kubenswrapper[4790]: E0313 21:27:21.598858 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644\": container with ID starting with 3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644 not found: ID does not exist" containerID="3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598887 4790 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644"} err="failed to get container status \"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644\": rpc error: code = NotFound desc = could not find container \"3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644\": container with ID starting with 3d87805110f897c1508908768643f7561833b67f4c655e4e52294c59914f6644 not found: ID does not exist" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.598907 4790 scope.go:117] "RemoveContainer" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" Mar 13 21:27:21 crc kubenswrapper[4790]: E0313 21:27:21.599162 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6\": container with ID starting with 8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6 not found: ID does not exist" containerID="8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.599186 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6"} err="failed to get container status \"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6\": rpc error: code = NotFound desc = could not find container \"8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6\": container with ID starting with 8b0ee3586781cc82cc74300b4b12542f4c5fae45fe46865aef2a309d7c1372d6 not found: ID does not exist" Mar 13 21:27:21 crc kubenswrapper[4790]: I0313 21:27:21.669868 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" path="/var/lib/kubelet/pods/986a26ad-d48a-428b-99bb-7684ea902e87/volumes" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.425519 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-k8mcs_c7ef6baa-3c87-44a8-91d2-bcfbc0696396/nmstate-console-plugin/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.572872 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-b2697_d5c9a572-635b-4ecc-a2a4-c7e459d6d510/nmstate-handler/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.631727 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wvv95_4295503b-996b-4a20-844b-07a90de225a6/kube-rbac-proxy/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.720408 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-wvv95_4295503b-996b-4a20-844b-07a90de225a6/nmstate-metrics/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.832012 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-4lvtv_4d5f9755-21a7-482e-8788-85ed86738b40/nmstate-operator/0.log" Mar 13 21:27:29 crc kubenswrapper[4790]: I0313 21:27:29.910744 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-qld4w_e1a3b709-858c-4bca-b52b-c96dc23d9149/nmstate-webhook/0.log" Mar 13 21:27:55 crc kubenswrapper[4790]: I0313 21:27:55.794747 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_controller-7bb4cc7c98-czl9k_d5ef8654-e56f-454b-9fae-0753a30dab0f/kube-rbac-proxy/0.log" Mar 13 21:27:55 crc kubenswrapper[4790]: I0313 21:27:55.867054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-czl9k_d5ef8654-e56f-454b-9fae-0753a30dab0f/controller/0.log" Mar 13 21:27:55 crc kubenswrapper[4790]: I0313 21:27:55.981033 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.197583 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.229873 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.245493 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.252252 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.461213 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.487819 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.498684 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.532949 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.644273 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-frr-files/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.653958 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-reloader/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.678993 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/cp-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.729579 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/controller/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.827322 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/frr-metrics/0.log" Mar 13 21:27:56 crc kubenswrapper[4790]: I0313 21:27:56.865349 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/kube-rbac-proxy/0.log" Mar 13 21:27:56 crc 
kubenswrapper[4790]: I0313 21:27:56.949055 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/kube-rbac-proxy-frr/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.035617 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/reloader/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.173223 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-8ckr8_472cc73a-53fe-4d7c-aec8-b2154023ba90/frr-k8s-webhook-server/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.322932 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6c885c8d8c-fcv54_da23093d-500f-43f4-805a-b4a252e40940/manager/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.450546 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-76c9b767d4-c6mq2_783be831-b522-42a0-9cbe-f234ed3a027c/webhook-server/0.log" Mar 13 21:27:57 crc kubenswrapper[4790]: I0313 21:27:57.681751 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5tk2m_a3729738-ead5-47e0-95de-04dc39fb0516/kube-rbac-proxy/0.log" Mar 13 21:27:58 crc kubenswrapper[4790]: I0313 21:27:58.293411 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5tk2m_a3729738-ead5-47e0-95de-04dc39fb0516/speaker/0.log" Mar 13 21:27:58 crc kubenswrapper[4790]: I0313 21:27:58.516228 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-r97zs_3ab7e856-a311-4e29-aabf-adaa27363613/frr/0.log" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.143726 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:28:00 crc kubenswrapper[4790]: E0313 21:28:00.144473 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-content" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144490 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-content" Mar 13 21:28:00 crc kubenswrapper[4790]: E0313 21:28:00.144518 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-utilities" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144525 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="extract-utilities" Mar 13 21:28:00 crc kubenswrapper[4790]: E0313 21:28:00.144549 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144557 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.144805 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="986a26ad-d48a-428b-99bb-7684ea902e87" containerName="registry-server" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.145706 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.147838 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.150765 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.150992 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.153202 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.281023 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"auto-csr-approver-29557288-qmdk7\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.382895 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"auto-csr-approver-29557288-qmdk7\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.428662 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"auto-csr-approver-29557288-qmdk7\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.464992 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:00 crc kubenswrapper[4790]: I0313 21:28:00.914500 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:28:01 crc kubenswrapper[4790]: I0313 21:28:01.863224 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" event={"ID":"4d3de3f1-0534-4203-b465-d512d6c80287","Type":"ContainerStarted","Data":"083d7cbcd12c287e727aed7a01b7ce6bd4531ca855019f4fdabe4a1671c16c15"} Mar 13 21:28:02 crc kubenswrapper[4790]: I0313 21:28:02.874160 4790 generic.go:334] "Generic (PLEG): container finished" podID="4d3de3f1-0534-4203-b465-d512d6c80287" containerID="c265de87623abb9a96ed933e22a3276547bc13888411d097434621497cc49ed1" exitCode=0 Mar 13 21:28:02 crc kubenswrapper[4790]: I0313 21:28:02.874213 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" event={"ID":"4d3de3f1-0534-4203-b465-d512d6c80287","Type":"ContainerDied","Data":"c265de87623abb9a96ed933e22a3276547bc13888411d097434621497cc49ed1"} Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.191303 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.358115 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") pod \"4d3de3f1-0534-4203-b465-d512d6c80287\" (UID: \"4d3de3f1-0534-4203-b465-d512d6c80287\") " Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.366054 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6" (OuterVolumeSpecName: "kube-api-access-9pqf6") pod "4d3de3f1-0534-4203-b465-d512d6c80287" (UID: "4d3de3f1-0534-4203-b465-d512d6c80287"). InnerVolumeSpecName "kube-api-access-9pqf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.460485 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pqf6\" (UniqueName: \"kubernetes.io/projected/4d3de3f1-0534-4203-b465-d512d6c80287-kube-api-access-9pqf6\") on node \"crc\" DevicePath \"\"" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.900041 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" event={"ID":"4d3de3f1-0534-4203-b465-d512d6c80287","Type":"ContainerDied","Data":"083d7cbcd12c287e727aed7a01b7ce6bd4531ca855019f4fdabe4a1671c16c15"} Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.900082 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="083d7cbcd12c287e727aed7a01b7ce6bd4531ca855019f4fdabe4a1671c16c15" Mar 13 21:28:04 crc kubenswrapper[4790]: I0313 21:28:04.900140 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557288-qmdk7" Mar 13 21:28:05 crc kubenswrapper[4790]: I0313 21:28:05.264673 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:28:05 crc kubenswrapper[4790]: I0313 21:28:05.277103 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557282-h6blc"] Mar 13 21:28:05 crc kubenswrapper[4790]: I0313 21:28:05.669519 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4a5c228-86a3-4945-8d95-44db739406d7" path="/var/lib/kubelet/pods/e4a5c228-86a3-4945-8d95-44db739406d7/volumes" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.406864 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/util/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.625446 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/pull/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.636763 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/pull/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.655025 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/util/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.789002 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/util/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.811742 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/pull/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.818448 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde87456z5v_16b499fa-d8a4-4f3f-bcaf-aa9fa7b43854/extract/0.log" Mar 13 21:28:10 crc kubenswrapper[4790]: I0313 21:28:10.981282 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/util/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.139106 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/util/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.142614 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/pull/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.165828 4790 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/pull/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.302218 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/util/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.313006 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/extract/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.326191 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1df9px_6940903a-9dc5-4001-bc87-9de2bdce9e52/pull/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.467352 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-utilities/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.636988 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-content/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.639003 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-utilities/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.652188 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-content/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.858859 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-utilities/0.log" Mar 13 21:28:11 crc kubenswrapper[4790]: I0313 21:28:11.898336 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.117580 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-utilities/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.354989 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-utilities/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.381109 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.424509 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.454503 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-q5brt_9e374399-85bd-4121-9352-23a37bdf41f3/registry-server/0.log" Mar 13 21:28:12 
crc kubenswrapper[4790]: I0313 21:28:12.592054 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-utilities/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.616545 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/extract-content/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.822610 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-n548b_97fe66e8-7366-4c61-b1db-4d98459834da/marketplace-operator/0.log" Mar 13 21:28:12 crc kubenswrapper[4790]: I0313 21:28:12.925933 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.010057 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-qcdqx_2ab722e2-16ac-40ba-9c44-903bf6bb8db8/registry-server/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.099328 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.148876 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.174575 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.316130 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.359795 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.534901 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-zhllj_0af82375-cffb-4861-82d2-5f1a0e4a8496/registry-server/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.538915 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.776780 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-utilities/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.798544 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-content/0.log" Mar 13 21:28:13 crc kubenswrapper[4790]: I0313 21:28:13.841222 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-content/0.log" Mar 13 21:28:14 crc 
kubenswrapper[4790]: I0313 21:28:14.008512 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-utilities/0.log" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.015303 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.015523 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.079873 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/extract-content/0.log" Mar 13 21:28:14 crc kubenswrapper[4790]: I0313 21:28:14.634706 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-b7b2s_5ad984b4-e6a7-4559-99e4-02a03eda6303/registry-server/0.log" Mar 13 21:28:19 crc kubenswrapper[4790]: I0313 21:28:19.754903 4790 scope.go:117] "RemoveContainer" containerID="5019beb318c0070d1f51637c47bb15945a64aa1c344d598234b2e66e74401ef0" Mar 13 21:28:44 crc kubenswrapper[4790]: I0313 21:28:44.015988 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:28:44 crc kubenswrapper[4790]: I0313 21:28:44.016581 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.015772 4790 patch_prober.go:28] interesting pod/machine-config-daemon-drtsx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.016424 4790 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.016490 4790 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.017543 4790 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80"} pod="openshift-machine-config-operator/machine-config-daemon-drtsx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.017629 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" containerName="machine-config-daemon" containerID="cri-o://0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" gracePeriod=600 Mar 13 21:29:14 crc kubenswrapper[4790]: E0313 21:29:14.136536 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476119 4790 generic.go:334] "Generic (PLEG): container finished" podID="58464a30-7f56-4e13-894e-e53498a85637" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" exitCode=0 Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476193 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerDied","Data":"0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80"} Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476258 4790 scope.go:117] "RemoveContainer" containerID="5e764877937c3d83a4b1853363d471bb75b0ef968565309da1f28c291b8d45e7" Mar 13 21:29:14 crc kubenswrapper[4790]: I0313 21:29:14.476940 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:14 crc kubenswrapper[4790]: E0313 21:29:14.477451 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:28 crc kubenswrapper[4790]: I0313 21:29:28.661800 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:28 crc kubenswrapper[4790]: E0313 21:29:28.662825 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:40 crc kubenswrapper[4790]: I0313 21:29:40.660414 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:40 crc kubenswrapper[4790]: E0313 21:29:40.661296 4790 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:29:55 crc kubenswrapper[4790]: I0313 21:29:55.659925 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:29:55 crc kubenswrapper[4790]: E0313 21:29:55.662026 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.153751 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557290-2w2zk"] Mar 13 21:30:00 crc kubenswrapper[4790]: E0313 21:30:00.154838 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" containerName="oc" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.154857 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" containerName="oc" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.155149 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" containerName="oc" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.156041 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.158706 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.158714 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.166407 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.166654 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.167880 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.169670 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.175172 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.179848 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557290-2w2zk"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.193282 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.255281 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.255645 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"auto-csr-approver-29557290-2w2zk\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.255809 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.256163 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.357857 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.357970 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"auto-csr-approver-29557290-2w2zk\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.358037 4790 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.358145 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.360749 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.364299 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.377437 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"auto-csr-approver-29557290-2w2zk\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.388405 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"collect-profiles-29557290-8rn8d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.481933 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.488833 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.954915 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557290-2w2zk"] Mar 13 21:30:00 crc kubenswrapper[4790]: I0313 21:30:00.958447 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:30:01 crc kubenswrapper[4790]: W0313 21:30:01.035865 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3891db6b_832a_4f78_9d91_2945136ac41d.slice/crio-7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10 WatchSource:0}: Error finding container 7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10: Status 404 returned error can't find the container with id 7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10 Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.052611 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d"] Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.920092 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerStarted","Data":"0a0cecd5c3e5fda8f03d676b699f878194e10eb22bb6b8ec985f0446cc2c1fa3"} Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.922628 4790 generic.go:334] "Generic (PLEG): container finished" podID="3891db6b-832a-4f78-9d91-2945136ac41d" containerID="8166015613c291721772d0950e6a082638300ea94be5c443065d9e3891ab62a3" exitCode=0 Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.922667 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" event={"ID":"3891db6b-832a-4f78-9d91-2945136ac41d","Type":"ContainerDied","Data":"8166015613c291721772d0950e6a082638300ea94be5c443065d9e3891ab62a3"} Mar 13 21:30:01 crc kubenswrapper[4790]: I0313 21:30:01.922687 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" event={"ID":"3891db6b-832a-4f78-9d91-2945136ac41d","Type":"ContainerStarted","Data":"7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10"} Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.933796 4790 generic.go:334] "Generic (PLEG): container finished" podID="09855131-fcae-4c41-83c2-2874fd6e7068" containerID="3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8" exitCode=0 Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.933943 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" event={"ID":"09855131-fcae-4c41-83c2-2874fd6e7068","Type":"ContainerDied","Data":"3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8"} Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.934939 4790 scope.go:117] "RemoveContainer" containerID="3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8" Mar 13 21:30:02 crc kubenswrapper[4790]: I0313 21:30:02.935909 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerStarted","Data":"8545a35dacc04bd1f83edfa9e8f634e87f549dc58a48933c26d35edf437fcb49"} Mar 13 21:30:02 crc 
kubenswrapper[4790]: I0313 21:30:02.973058 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" podStartSLOduration=1.398502391 podStartE2EDuration="2.973031851s" podCreationTimestamp="2026-03-13 21:30:00 +0000 UTC" firstStartedPulling="2026-03-13 21:30:00.958191893 +0000 UTC m=+3731.979307784" lastFinishedPulling="2026-03-13 21:30:02.532721363 +0000 UTC m=+3733.553837244" observedRunningTime="2026-03-13 21:30:02.967170891 +0000 UTC m=+3733.988286782" watchObservedRunningTime="2026-03-13 21:30:02.973031851 +0000 UTC m=+3733.994147742" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.263824 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.323326 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") pod \"3891db6b-832a-4f78-9d91-2945136ac41d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.323531 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") pod \"3891db6b-832a-4f78-9d91-2945136ac41d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.323574 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") pod \"3891db6b-832a-4f78-9d91-2945136ac41d\" (UID: \"3891db6b-832a-4f78-9d91-2945136ac41d\") " Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.324711 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume" (OuterVolumeSpecName: "config-volume") pod "3891db6b-832a-4f78-9d91-2945136ac41d" (UID: "3891db6b-832a-4f78-9d91-2945136ac41d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.331630 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29" (OuterVolumeSpecName: "kube-api-access-szz29") pod "3891db6b-832a-4f78-9d91-2945136ac41d" (UID: "3891db6b-832a-4f78-9d91-2945136ac41d"). InnerVolumeSpecName "kube-api-access-szz29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.331745 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3891db6b-832a-4f78-9d91-2945136ac41d" (UID: "3891db6b-832a-4f78-9d91-2945136ac41d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.397637 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/gather/0.log" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.425724 4790 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3891db6b-832a-4f78-9d91-2945136ac41d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.425758 4790 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3891db6b-832a-4f78-9d91-2945136ac41d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.425770 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szz29\" (UniqueName: \"kubernetes.io/projected/3891db6b-832a-4f78-9d91-2945136ac41d-kube-api-access-szz29\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.949152 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" event={"ID":"3891db6b-832a-4f78-9d91-2945136ac41d","Type":"ContainerDied","Data":"7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10"} Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.949509 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7422975de7713d05618db81d72e5d50a01a667ab335640040ec85093c38b0a10" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.949199 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29557290-8rn8d" Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.950934 4790 generic.go:334] "Generic (PLEG): container finished" podID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerID="8545a35dacc04bd1f83edfa9e8f634e87f549dc58a48933c26d35edf437fcb49" exitCode=0 Mar 13 21:30:03 crc kubenswrapper[4790]: I0313 21:30:03.950981 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerDied","Data":"8545a35dacc04bd1f83edfa9e8f634e87f549dc58a48933c26d35edf437fcb49"} Mar 13 21:30:04 crc kubenswrapper[4790]: I0313 21:30:04.331576 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 21:30:04 crc kubenswrapper[4790]: I0313 21:30:04.339966 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29557245-5vhkw"] Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.334493 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.479585 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") pod \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\" (UID: \"c83c8e92-0c0b-4b43-8391-1c63b8755c64\") " Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.487035 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4" (OuterVolumeSpecName: "kube-api-access-lzbs4") pod "c83c8e92-0c0b-4b43-8391-1c63b8755c64" (UID: "c83c8e92-0c0b-4b43-8391-1c63b8755c64"). InnerVolumeSpecName "kube-api-access-lzbs4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.582004 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzbs4\" (UniqueName: \"kubernetes.io/projected/c83c8e92-0c0b-4b43-8391-1c63b8755c64-kube-api-access-lzbs4\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.670997 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0001db4d-b91a-473e-bfff-794d8663885f" path="/var/lib/kubelet/pods/0001db4d-b91a-473e-bfff-794d8663885f/volumes" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.970618 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" event={"ID":"c83c8e92-0c0b-4b43-8391-1c63b8755c64","Type":"ContainerDied","Data":"0a0cecd5c3e5fda8f03d676b699f878194e10eb22bb6b8ec985f0446cc2c1fa3"} Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.970831 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a0cecd5c3e5fda8f03d676b699f878194e10eb22bb6b8ec985f0446cc2c1fa3" Mar 13 21:30:05 crc kubenswrapper[4790]: I0313 21:30:05.970704 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557290-2w2zk" Mar 13 21:30:06 crc kubenswrapper[4790]: I0313 21:30:06.384586 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:30:06 crc kubenswrapper[4790]: I0313 21:30:06.392367 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557284-fszb4"] Mar 13 21:30:07 crc kubenswrapper[4790]: I0313 21:30:07.671467 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e71263f0-7309-4046-b71d-2ae38e13d27c" path="/var/lib/kubelet/pods/e71263f0-7309-4046-b71d-2ae38e13d27c/volumes" Mar 13 21:30:09 crc kubenswrapper[4790]: I0313 21:30:09.670694 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:09 crc kubenswrapper[4790]: E0313 21:30:09.671298 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:11 crc kubenswrapper[4790]: I0313 21:30:11.695591 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:30:11 crc kubenswrapper[4790]: I0313 21:30:11.695854 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" containerID="cri-o://f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb" gracePeriod=2 Mar 13 21:30:11 crc kubenswrapper[4790]: I0313 21:30:11.710276 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6kfv/must-gather-qf7z2"] Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.022139 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/copy/0.log" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.023164 4790 generic.go:334] "Generic (PLEG): container finished" podID="09855131-fcae-4c41-83c2-2874fd6e7068" containerID="f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb" exitCode=143 Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.202643 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/copy/0.log" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.203191 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.315505 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") pod \"09855131-fcae-4c41-83c2-2874fd6e7068\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.316052 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") pod \"09855131-fcae-4c41-83c2-2874fd6e7068\" (UID: \"09855131-fcae-4c41-83c2-2874fd6e7068\") " Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.335605 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf" (OuterVolumeSpecName: "kube-api-access-hnqnf") pod "09855131-fcae-4c41-83c2-2874fd6e7068" (UID: "09855131-fcae-4c41-83c2-2874fd6e7068"). InnerVolumeSpecName "kube-api-access-hnqnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.417697 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hnqnf\" (UniqueName: \"kubernetes.io/projected/09855131-fcae-4c41-83c2-2874fd6e7068-kube-api-access-hnqnf\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.474317 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "09855131-fcae-4c41-83c2-2874fd6e7068" (UID: "09855131-fcae-4c41-83c2-2874fd6e7068"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:30:12 crc kubenswrapper[4790]: I0313 21:30:12.519266 4790 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/09855131-fcae-4c41-83c2-2874fd6e7068-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.033313 4790 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6kfv_must-gather-qf7z2_09855131-fcae-4c41-83c2-2874fd6e7068/copy/0.log" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.034907 4790 scope.go:117] "RemoveContainer" containerID="f2a116706cb391169c51f4180351f0429f8c305252cf4438d5b41c53f1d8a0cb" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.034932 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6kfv/must-gather-qf7z2" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.062301 4790 scope.go:117] "RemoveContainer" containerID="3a1d32bb413765095ebca93898109c48096d1087d7c63a4448ef4a85e11734c8" Mar 13 21:30:13 crc kubenswrapper[4790]: I0313 21:30:13.669397 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" path="/var/lib/kubelet/pods/09855131-fcae-4c41-83c2-2874fd6e7068/volumes" Mar 13 21:30:19 crc kubenswrapper[4790]: I0313 21:30:19.889257 4790 scope.go:117] "RemoveContainer" containerID="0e3d04fd35f846d0f8577da19c18befcb486539f0a1127e22cf8b9a5e5547ef3" Mar 13 21:30:19 crc kubenswrapper[4790]: I0313 21:30:19.928757 4790 scope.go:117] "RemoveContainer" containerID="b6265fc857b5a799a558f01ccfe69d069d440ad15cd4409b5956f9cdc01bead3" Mar 13 21:30:23 crc kubenswrapper[4790]: I0313 21:30:23.659571 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:23 crc kubenswrapper[4790]: E0313 21:30:23.660513 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:36 crc kubenswrapper[4790]: I0313 21:30:36.659366 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:36 crc kubenswrapper[4790]: E0313 21:30:36.662049 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:30:50 crc kubenswrapper[4790]: I0313 21:30:50.660783 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:30:50 crc kubenswrapper[4790]: E0313 21:30:50.662029 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:05 crc kubenswrapper[4790]: I0313 21:31:05.660445 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:05 crc kubenswrapper[4790]: E0313 21:31:05.661084 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" 
Mar 13 21:31:19 crc kubenswrapper[4790]: I0313 21:31:19.669905 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:19 crc kubenswrapper[4790]: E0313 21:31:19.670754 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:34 crc kubenswrapper[4790]: I0313 21:31:34.659911 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:34 crc kubenswrapper[4790]: E0313 21:31:34.660681 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:48 crc kubenswrapper[4790]: I0313 21:31:48.660084 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:48 crc kubenswrapper[4790]: E0313 21:31:48.660835 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:31:59 crc kubenswrapper[4790]: I0313 21:31:59.670292 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:31:59 crc kubenswrapper[4790]: E0313 21:31:59.672267 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.142687 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557292-j9s66"] Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143344 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143360 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143487 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerName="oc" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143504 4790 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerName="oc" Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143533 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3891db6b-832a-4f78-9d91-2945136ac41d" containerName="collect-profiles" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143541 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="3891db6b-832a-4f78-9d91-2945136ac41d" containerName="collect-profiles" Mar 13 21:32:00 crc kubenswrapper[4790]: E0313 21:32:00.143560 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="gather" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143571 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="gather" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143793 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="gather" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143807 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="3891db6b-832a-4f78-9d91-2945136ac41d" containerName="collect-profiles" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143831 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="c83c8e92-0c0b-4b43-8391-1c63b8755c64" containerName="oc" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.143848 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="09855131-fcae-4c41-83c2-2874fd6e7068" containerName="copy" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.144589 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.148669 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.148787 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.148855 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.154132 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557292-j9s66"] Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.216207 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"auto-csr-approver-29557292-j9s66\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.318010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"auto-csr-approver-29557292-j9s66\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.342706 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckbc8\" (UniqueName: 
\"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"auto-csr-approver-29557292-j9s66\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.529859 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:00 crc kubenswrapper[4790]: I0313 21:32:00.958769 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557292-j9s66"] Mar 13 21:32:01 crc kubenswrapper[4790]: I0313 21:32:01.966707 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557292-j9s66" event={"ID":"6fa77812-592b-430d-b5db-15fab10e53e4","Type":"ContainerStarted","Data":"cd23023c8aff4e6acd5a4837074b7bbf1bbe449797445fb899e0f6e65d54b90f"} Mar 13 21:32:02 crc kubenswrapper[4790]: I0313 21:32:02.979627 4790 generic.go:334] "Generic (PLEG): container finished" podID="6fa77812-592b-430d-b5db-15fab10e53e4" containerID="d84e80b86d9cdab806e9b9bb5e0fdea7bd8634254927c159e0360e5f07881efd" exitCode=0 Mar 13 21:32:02 crc kubenswrapper[4790]: I0313 21:32:02.979757 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557292-j9s66" event={"ID":"6fa77812-592b-430d-b5db-15fab10e53e4","Type":"ContainerDied","Data":"d84e80b86d9cdab806e9b9bb5e0fdea7bd8634254927c159e0360e5f07881efd"} Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.338023 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.421494 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") pod \"6fa77812-592b-430d-b5db-15fab10e53e4\" (UID: \"6fa77812-592b-430d-b5db-15fab10e53e4\") " Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.427491 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8" (OuterVolumeSpecName: "kube-api-access-ckbc8") pod "6fa77812-592b-430d-b5db-15fab10e53e4" (UID: "6fa77812-592b-430d-b5db-15fab10e53e4"). InnerVolumeSpecName "kube-api-access-ckbc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.524215 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckbc8\" (UniqueName: \"kubernetes.io/projected/6fa77812-592b-430d-b5db-15fab10e53e4-kube-api-access-ckbc8\") on node \"crc\" DevicePath \"\"" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.997768 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557292-j9s66" event={"ID":"6fa77812-592b-430d-b5db-15fab10e53e4","Type":"ContainerDied","Data":"cd23023c8aff4e6acd5a4837074b7bbf1bbe449797445fb899e0f6e65d54b90f"} Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.997805 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd23023c8aff4e6acd5a4837074b7bbf1bbe449797445fb899e0f6e65d54b90f" Mar 13 21:32:04 crc kubenswrapper[4790]: I0313 21:32:04.997827 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557292-j9s66" Mar 13 21:32:05 crc kubenswrapper[4790]: I0313 21:32:05.410816 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:32:05 crc kubenswrapper[4790]: I0313 21:32:05.419183 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557286-j2hgs"] Mar 13 21:32:05 crc kubenswrapper[4790]: I0313 21:32:05.670793 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93d597e8-cf42-4f34-a6c1-ffe9416a562b" path="/var/lib/kubelet/pods/93d597e8-cf42-4f34-a6c1-ffe9416a562b/volumes" Mar 13 21:32:13 crc kubenswrapper[4790]: I0313 21:32:13.660537 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:13 crc kubenswrapper[4790]: E0313 21:32:13.661809 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:20 crc kubenswrapper[4790]: I0313 21:32:20.081606 4790 scope.go:117] "RemoveContainer" containerID="19ec3b81cfc93adcffb8135210e1ea8d379fb945e3eddc6ee978b60b4ce52405" Mar 13 21:32:20 crc kubenswrapper[4790]: I0313 21:32:20.105412 4790 scope.go:117] "RemoveContainer" containerID="d255e7ab1f308e1f21736aa4f57843906cd9283c436db74b17e9a79b7ff4810a" Mar 13 21:32:28 crc kubenswrapper[4790]: I0313 21:32:28.660230 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:28 crc kubenswrapper[4790]: E0313 21:32:28.661110 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:39 crc kubenswrapper[4790]: I0313 21:32:39.667508 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:39 crc kubenswrapper[4790]: E0313 21:32:39.668336 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:32:51 crc kubenswrapper[4790]: I0313 21:32:51.659668 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:32:51 crc kubenswrapper[4790]: E0313 21:32:51.660435 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:05 crc kubenswrapper[4790]: I0313 21:33:05.660620 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:05 crc kubenswrapper[4790]: E0313 21:33:05.661623 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:16 crc kubenswrapper[4790]: I0313 21:33:16.659863 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:16 crc kubenswrapper[4790]: E0313 21:33:16.660656 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:27 crc kubenswrapper[4790]: I0313 21:33:27.659668 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:27 crc kubenswrapper[4790]: E0313 21:33:27.660477 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:39 crc kubenswrapper[4790]: I0313 21:33:39.665653 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:39 crc kubenswrapper[4790]: E0313 21:33:39.666360 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:33:53 crc kubenswrapper[4790]: I0313 21:33:53.660712 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:33:53 crc kubenswrapper[4790]: E0313 21:33:53.661792 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" 
podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.141273 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557294-pd2t2"] Mar 13 21:34:00 crc kubenswrapper[4790]: E0313 21:34:00.142091 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa77812-592b-430d-b5db-15fab10e53e4" containerName="oc" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.142114 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fa77812-592b-430d-b5db-15fab10e53e4" containerName="oc" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.142370 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa77812-592b-430d-b5db-15fab10e53e4" containerName="oc" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.143061 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.146155 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.146686 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.146944 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.151504 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557294-pd2t2"] Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.281511 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"auto-csr-approver-29557294-pd2t2\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.384569 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"auto-csr-approver-29557294-pd2t2\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.404470 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"auto-csr-approver-29557294-pd2t2\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.465450 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:00 crc kubenswrapper[4790]: I0313 21:34:00.923184 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557294-pd2t2"] Mar 13 21:34:00 crc kubenswrapper[4790]: W0313 21:34:00.928108 4790 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod552e6693_6bb3_4722_8be8_4ed07c6c3953.slice/crio-86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c WatchSource:0}: Error finding container 86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c: Status 404 returned error can't find the container with id 86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c Mar 13 21:34:01 crc kubenswrapper[4790]: I0313 21:34:01.036979 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" event={"ID":"552e6693-6bb3-4722-8be8-4ed07c6c3953","Type":"ContainerStarted","Data":"86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c"} Mar 13 21:34:03 crc kubenswrapper[4790]: I0313 21:34:03.057515 4790 generic.go:334] "Generic (PLEG): container finished" podID="552e6693-6bb3-4722-8be8-4ed07c6c3953" containerID="2c6aa1facaa53aecef5b73114336558c2aa3bd5a0b9119d7519041f577dcea9e" exitCode=0 Mar 13 21:34:03 crc kubenswrapper[4790]: I0313 21:34:03.057612 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" event={"ID":"552e6693-6bb3-4722-8be8-4ed07c6c3953","Type":"ContainerDied","Data":"2c6aa1facaa53aecef5b73114336558c2aa3bd5a0b9119d7519041f577dcea9e"} Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.509359 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.659891 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.659941 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") pod \"552e6693-6bb3-4722-8be8-4ed07c6c3953\" (UID: \"552e6693-6bb3-4722-8be8-4ed07c6c3953\") " Mar 13 21:34:04 crc kubenswrapper[4790]: E0313 21:34:04.660304 4790 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-drtsx_openshift-machine-config-operator(58464a30-7f56-4e13-894e-e53498a85637)\"" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" podUID="58464a30-7f56-4e13-894e-e53498a85637" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.666541 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt" (OuterVolumeSpecName: "kube-api-access-j4rzt") pod "552e6693-6bb3-4722-8be8-4ed07c6c3953" (UID: "552e6693-6bb3-4722-8be8-4ed07c6c3953"). InnerVolumeSpecName "kube-api-access-j4rzt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:34:04 crc kubenswrapper[4790]: I0313 21:34:04.762574 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4rzt\" (UniqueName: \"kubernetes.io/projected/552e6693-6bb3-4722-8be8-4ed07c6c3953-kube-api-access-j4rzt\") on node \"crc\" DevicePath \"\"" Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.075174 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" event={"ID":"552e6693-6bb3-4722-8be8-4ed07c6c3953","Type":"ContainerDied","Data":"86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c"} Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.075500 4790 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86159326cb16f3e9e7a3dbe79ae5f9b83352a840b38032c8f9ef2c1291a6818c" Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.075639 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29557294-pd2t2" Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.576895 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.586461 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29557288-qmdk7"] Mar 13 21:34:05 crc kubenswrapper[4790]: I0313 21:34:05.668860 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d3de3f1-0534-4203-b465-d512d6c80287" path="/var/lib/kubelet/pods/4d3de3f1-0534-4203-b465-d512d6c80287/volumes" Mar 13 21:34:19 crc kubenswrapper[4790]: I0313 21:34:19.666880 4790 scope.go:117] "RemoveContainer" containerID="0583c12abd9c959bab92f13c40e5bcf138acfa34bff1e0b1b2b76d7acb3ebe80" Mar 13 21:34:20 crc kubenswrapper[4790]: I0313 21:34:20.192805 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-drtsx" event={"ID":"58464a30-7f56-4e13-894e-e53498a85637","Type":"ContainerStarted","Data":"7b5f1e5b820d638552a16c09dc9dbbd33ace522261e8adb6296689912e8ae35c"} Mar 13 21:34:20 crc kubenswrapper[4790]: I0313 21:34:20.241892 4790 scope.go:117] "RemoveContainer" containerID="c265de87623abb9a96ed933e22a3276547bc13888411d097434621497cc49ed1" Mar 13 21:35:43 crc kubenswrapper[4790]: I0313 21:35:43.807876 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lrmgj"] Mar 13 21:35:43 crc kubenswrapper[4790]: E0313 21:35:43.808904 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="552e6693-6bb3-4722-8be8-4ed07c6c3953" containerName="oc" Mar 13 21:35:43 crc kubenswrapper[4790]: I0313 21:35:43.809096 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="552e6693-6bb3-4722-8be8-4ed07c6c3953" containerName="oc" Mar 13 21:35:43 crc kubenswrapper[4790]: I0313 21:35:43.809291 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="552e6693-6bb3-4722-8be8-4ed07c6c3953" containerName="oc" Mar 13 21:35:43 crc kubenswrapper[4790]: I0313 21:35:43.830060 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:43 crc kubenswrapper[4790]: I0313 21:35:43.838578 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrmgj"] Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.016849 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7l77\" (UniqueName: \"kubernetes.io/projected/209c0f4d-97e5-4cf0-bac5-88abc967decb-kube-api-access-d7l77\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.017269 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-utilities\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.017395 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-catalog-content\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.118867 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7l77\" (UniqueName: \"kubernetes.io/projected/209c0f4d-97e5-4cf0-bac5-88abc967decb-kube-api-access-d7l77\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.118957 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-utilities\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.119098 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-catalog-content\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.119641 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-catalog-content\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.119896 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-utilities\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.139277 4790 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d7l77\" (UniqueName: \"kubernetes.io/projected/209c0f4d-97e5-4cf0-bac5-88abc967decb-kube-api-access-d7l77\") pod \"community-operators-lrmgj\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.173724 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.694365 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lrmgj"] Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.896241 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerStarted","Data":"1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456"} Mar 13 21:35:44 crc kubenswrapper[4790]: I0313 21:35:44.896593 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerStarted","Data":"d7779f370bc6b7035c8ea7ea53d41b4824922b8ee0361bd759fbee310f3449c6"} Mar 13 21:35:45 crc kubenswrapper[4790]: I0313 21:35:45.907978 4790 generic.go:334] "Generic (PLEG): container finished" podID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerID="1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456" exitCode=0 Mar 13 21:35:45 crc kubenswrapper[4790]: I0313 21:35:45.908126 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerDied","Data":"1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456"} Mar 13 21:35:45 crc kubenswrapper[4790]: I0313 21:35:45.912031 4790 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.802791 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jpd96"] Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.804980 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.816137 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpd96"] Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.922355 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerStarted","Data":"60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876"} Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.972342 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5652a045-40b0-4a4e-a3b1-46f376d3efb0-catalog-content\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.972665 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5652a045-40b0-4a4e-a3b1-46f376d3efb0-utilities\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:46 crc kubenswrapper[4790]: I0313 21:35:46.972804 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4msk\" (UniqueName: \"kubernetes.io/projected/5652a045-40b0-4a4e-a3b1-46f376d3efb0-kube-api-access-n4msk\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.074651 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5652a045-40b0-4a4e-a3b1-46f376d3efb0-utilities\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.074723 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4msk\" (UniqueName: \"kubernetes.io/projected/5652a045-40b0-4a4e-a3b1-46f376d3efb0-kube-api-access-n4msk\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.074841 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5652a045-40b0-4a4e-a3b1-46f376d3efb0-catalog-content\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.075357 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5652a045-40b0-4a4e-a3b1-46f376d3efb0-catalog-content\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.075596 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5652a045-40b0-4a4e-a3b1-46f376d3efb0-utilities\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.100944 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4msk\" (UniqueName: \"kubernetes.io/projected/5652a045-40b0-4a4e-a3b1-46f376d3efb0-kube-api-access-n4msk\") pod \"redhat-operators-jpd96\" (UID: \"5652a045-40b0-4a4e-a3b1-46f376d3efb0\") " pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.131092 4790 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.644090 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jpd96"] Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.932228 4790 generic.go:334] "Generic (PLEG): container finished" podID="5652a045-40b0-4a4e-a3b1-46f376d3efb0" containerID="b70799cbf98b8b988800e0611ffad01b5ce9a78c85c2277d5f9704d55aeeb0a5" exitCode=0 Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.932307 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpd96" event={"ID":"5652a045-40b0-4a4e-a3b1-46f376d3efb0","Type":"ContainerDied","Data":"b70799cbf98b8b988800e0611ffad01b5ce9a78c85c2277d5f9704d55aeeb0a5"} Mar 13 21:35:47 crc kubenswrapper[4790]: I0313 21:35:47.932366 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpd96" event={"ID":"5652a045-40b0-4a4e-a3b1-46f376d3efb0","Type":"ContainerStarted","Data":"9ded5d0bfcf7fe5ba5442fba6daf13cd0b85cee73ee0da235a2a07707af4e4fc"} Mar 13 21:35:48 crc kubenswrapper[4790]: I0313 21:35:48.942258 4790 generic.go:334] "Generic (PLEG): container finished" podID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerID="60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876" exitCode=0 Mar 13 21:35:48 crc kubenswrapper[4790]: I0313 21:35:48.942308 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerDied","Data":"60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876"} Mar 13 21:35:48 crc kubenswrapper[4790]: I0313 21:35:48.946783 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpd96" event={"ID":"5652a045-40b0-4a4e-a3b1-46f376d3efb0","Type":"ContainerStarted","Data":"c9b2a5b61c86f0c2e9a1994a9a4100e00e074431ba24e075a984f642911fb803"} Mar 13 21:35:49 crc kubenswrapper[4790]: I0313 21:35:49.956074 4790 generic.go:334] "Generic (PLEG): container finished" podID="5652a045-40b0-4a4e-a3b1-46f376d3efb0" containerID="c9b2a5b61c86f0c2e9a1994a9a4100e00e074431ba24e075a984f642911fb803" exitCode=0 Mar 13 21:35:49 crc kubenswrapper[4790]: I0313 21:35:49.956219 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpd96" event={"ID":"5652a045-40b0-4a4e-a3b1-46f376d3efb0","Type":"ContainerDied","Data":"c9b2a5b61c86f0c2e9a1994a9a4100e00e074431ba24e075a984f642911fb803"} Mar 13 21:35:49 crc kubenswrapper[4790]: I0313 21:35:49.959759 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" 
event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerStarted","Data":"64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e"} Mar 13 21:35:49 crc kubenswrapper[4790]: I0313 21:35:49.998113 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lrmgj" podStartSLOduration=3.554409502 podStartE2EDuration="6.998095866s" podCreationTimestamp="2026-03-13 21:35:43 +0000 UTC" firstStartedPulling="2026-03-13 21:35:45.911749475 +0000 UTC m=+4076.932865366" lastFinishedPulling="2026-03-13 21:35:49.355435829 +0000 UTC m=+4080.376551730" observedRunningTime="2026-03-13 21:35:49.992459422 +0000 UTC m=+4081.013575323" watchObservedRunningTime="2026-03-13 21:35:49.998095866 +0000 UTC m=+4081.019211757" Mar 13 21:35:51 crc kubenswrapper[4790]: I0313 21:35:51.980835 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jpd96" event={"ID":"5652a045-40b0-4a4e-a3b1-46f376d3efb0","Type":"ContainerStarted","Data":"b16be9121eb6e8f172eb8ad37bdf2985b175f609a38d3d456673ae9ca008951f"} Mar 13 21:35:52 crc kubenswrapper[4790]: I0313 21:35:52.001345 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jpd96" podStartSLOduration=2.694463878 podStartE2EDuration="6.001325222s" podCreationTimestamp="2026-03-13 21:35:46 +0000 UTC" firstStartedPulling="2026-03-13 21:35:47.934163327 +0000 UTC m=+4078.955279218" lastFinishedPulling="2026-03-13 21:35:51.241024671 +0000 UTC m=+4082.262140562" observedRunningTime="2026-03-13 21:35:51.99794283 +0000 UTC m=+4083.019058731" watchObservedRunningTime="2026-03-13 21:35:52.001325222 +0000 UTC m=+4083.022441113" Mar 13 21:35:54 crc kubenswrapper[4790]: I0313 21:35:54.174633 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:54 crc kubenswrapper[4790]: I0313 21:35:54.175625 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:54 crc kubenswrapper[4790]: I0313 21:35:54.233700 4790 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:54 crc kubenswrapper[4790]: E0313 21:35:54.744879 4790 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5652a045_40b0_4a4e_a3b1_46f376d3efb0.slice/crio-c9b2a5b61c86f0c2e9a1994a9a4100e00e074431ba24e075a984f642911fb803.scope\": RecentStats: unable to find data in memory cache]" Mar 13 21:35:55 crc kubenswrapper[4790]: I0313 21:35:55.050498 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:55 crc kubenswrapper[4790]: I0313 21:35:55.992101 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrmgj"] Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.023898 4790 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lrmgj" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="registry-server" containerID="cri-o://64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e" gracePeriod=2 Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.132059 4790 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.132639 4790 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jpd96" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.522131 4790 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.684595 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-utilities\") pod \"209c0f4d-97e5-4cf0-bac5-88abc967decb\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.684963 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-catalog-content\") pod \"209c0f4d-97e5-4cf0-bac5-88abc967decb\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.685016 4790 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d7l77\" (UniqueName: \"kubernetes.io/projected/209c0f4d-97e5-4cf0-bac5-88abc967decb-kube-api-access-d7l77\") pod \"209c0f4d-97e5-4cf0-bac5-88abc967decb\" (UID: \"209c0f4d-97e5-4cf0-bac5-88abc967decb\") " Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.685487 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-utilities" (OuterVolumeSpecName: "utilities") pod "209c0f4d-97e5-4cf0-bac5-88abc967decb" (UID: "209c0f4d-97e5-4cf0-bac5-88abc967decb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.695628 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/209c0f4d-97e5-4cf0-bac5-88abc967decb-kube-api-access-d7l77" (OuterVolumeSpecName: "kube-api-access-d7l77") pod "209c0f4d-97e5-4cf0-bac5-88abc967decb" (UID: "209c0f4d-97e5-4cf0-bac5-88abc967decb"). InnerVolumeSpecName "kube-api-access-d7l77". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.740692 4790 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "209c0f4d-97e5-4cf0-bac5-88abc967decb" (UID: "209c0f4d-97e5-4cf0-bac5-88abc967decb"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.786988 4790 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-utilities\") on node \"crc\" DevicePath \"\"" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.787025 4790 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/209c0f4d-97e5-4cf0-bac5-88abc967decb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 13 21:35:57 crc kubenswrapper[4790]: I0313 21:35:57.787041 4790 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d7l77\" (UniqueName: \"kubernetes.io/projected/209c0f4d-97e5-4cf0-bac5-88abc967decb-kube-api-access-d7l77\") on node \"crc\" DevicePath \"\"" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.034548 4790 generic.go:334] "Generic (PLEG): container finished" podID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerID="64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e" exitCode=0 Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.034604 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerDied","Data":"64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e"} Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.034638 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lrmgj" event={"ID":"209c0f4d-97e5-4cf0-bac5-88abc967decb","Type":"ContainerDied","Data":"d7779f370bc6b7035c8ea7ea53d41b4824922b8ee0361bd759fbee310f3449c6"} Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.034659 4790 scope.go:117] "RemoveContainer" containerID="64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.034809 4790 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lrmgj" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.068483 4790 scope.go:117] "RemoveContainer" containerID="60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.072286 4790 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lrmgj"] Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.083373 4790 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lrmgj"] Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.104115 4790 scope.go:117] "RemoveContainer" containerID="1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.182955 4790 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jpd96" podUID="5652a045-40b0-4a4e-a3b1-46f376d3efb0" containerName="registry-server" probeResult="failure" output=< Mar 13 21:35:58 crc kubenswrapper[4790]: timeout: failed to connect service ":50051" within 1s Mar 13 21:35:58 crc kubenswrapper[4790]: > Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.859164 4790 scope.go:117] "RemoveContainer" containerID="64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e" Mar 13 21:35:58 crc kubenswrapper[4790]: E0313 21:35:58.860334 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e\": container with ID starting with 64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e not found: ID does not exist" containerID="64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.860370 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e"} err="failed to get container status \"64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e\": rpc error: code = NotFound desc = could not find container \"64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e\": container with ID starting with 64ab3ff45f80e623ae43f03f506ffe91f7837caeb3a46953c25779033750106e not found: ID does not exist" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.860422 4790 scope.go:117] "RemoveContainer" containerID="60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876" Mar 13 21:35:58 crc kubenswrapper[4790]: E0313 21:35:58.860867 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876\": container with ID starting with 60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876 not found: ID does not exist" containerID="60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.860893 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876"} err="failed to get container status \"60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876\": rpc error: code = NotFound desc = could not find container \"60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876\": 
container with ID starting with 60848fe9bdafcf88841b9b4831c85699b007fde9994ef699da26dfa86e69d876 not found: ID does not exist" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.860912 4790 scope.go:117] "RemoveContainer" containerID="1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456" Mar 13 21:35:58 crc kubenswrapper[4790]: E0313 21:35:58.865662 4790 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456\": container with ID starting with 1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456 not found: ID does not exist" containerID="1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456" Mar 13 21:35:58 crc kubenswrapper[4790]: I0313 21:35:58.865696 4790 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456"} err="failed to get container status \"1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456\": rpc error: code = NotFound desc = could not find container \"1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456\": container with ID starting with 1e74ba1cb717e21a3d1d201bc812047df042dc7eb71ef6797cfcb56cfb0f5456 not found: ID does not exist" Mar 13 21:35:59 crc kubenswrapper[4790]: I0313 21:35:59.694302 4790 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" path="/var/lib/kubelet/pods/209c0f4d-97e5-4cf0-bac5-88abc967decb/volumes" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.149915 4790 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29557296-lfckn"] Mar 13 21:36:00 crc kubenswrapper[4790]: E0313 21:36:00.150622 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="registry-server" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.150639 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="registry-server" Mar 13 21:36:00 crc kubenswrapper[4790]: E0313 21:36:00.150651 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="extract-utilities" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.150657 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="extract-utilities" Mar 13 21:36:00 crc kubenswrapper[4790]: E0313 21:36:00.150678 4790 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="extract-content" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.150686 4790 state_mem.go:107] "Deleted CPUSet assignment" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="extract-content" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.150842 4790 memory_manager.go:354] "RemoveStaleState removing state" podUID="209c0f4d-97e5-4cf0-bac5-88abc967decb" containerName="registry-server" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.151485 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557296-lfckn" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.153556 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.153840 4790 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.153909 4790 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-cgk6x" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.161727 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557296-lfckn"] Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.286355 4790 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ghft\" (UniqueName: \"kubernetes.io/projected/fbfacc9a-83d9-474b-bd66-32171b5e688a-kube-api-access-6ghft\") pod \"auto-csr-approver-29557296-lfckn\" (UID: \"fbfacc9a-83d9-474b-bd66-32171b5e688a\") " pod="openshift-infra/auto-csr-approver-29557296-lfckn" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.388010 4790 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ghft\" (UniqueName: \"kubernetes.io/projected/fbfacc9a-83d9-474b-bd66-32171b5e688a-kube-api-access-6ghft\") pod \"auto-csr-approver-29557296-lfckn\" (UID: \"fbfacc9a-83d9-474b-bd66-32171b5e688a\") " pod="openshift-infra/auto-csr-approver-29557296-lfckn" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.410666 4790 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ghft\" (UniqueName: \"kubernetes.io/projected/fbfacc9a-83d9-474b-bd66-32171b5e688a-kube-api-access-6ghft\") pod \"auto-csr-approver-29557296-lfckn\" (UID: \"fbfacc9a-83d9-474b-bd66-32171b5e688a\") " pod="openshift-infra/auto-csr-approver-29557296-lfckn" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.470918 4790 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29557296-lfckn" Mar 13 21:36:00 crc kubenswrapper[4790]: I0313 21:36:00.938797 4790 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29557296-lfckn"] Mar 13 21:36:01 crc kubenswrapper[4790]: I0313 21:36:01.061626 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557296-lfckn" event={"ID":"fbfacc9a-83d9-474b-bd66-32171b5e688a","Type":"ContainerStarted","Data":"52827ba3011a8a7999823007c52be04ac7148669d1bc20f3d80d7aa708547a7e"} Mar 13 21:36:02 crc kubenswrapper[4790]: I0313 21:36:02.071844 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557296-lfckn" event={"ID":"fbfacc9a-83d9-474b-bd66-32171b5e688a","Type":"ContainerStarted","Data":"49c9970e251292b4ff303c24b3a9ca28ce85ada6bb14912f2c5df8bfba473cb7"} Mar 13 21:36:02 crc kubenswrapper[4790]: I0313 21:36:02.092798 4790 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29557296-lfckn" podStartSLOduration=1.254432115 podStartE2EDuration="2.092775044s" podCreationTimestamp="2026-03-13 21:36:00 +0000 UTC" firstStartedPulling="2026-03-13 21:36:00.923685883 +0000 UTC m=+4091.944801774" lastFinishedPulling="2026-03-13 21:36:01.762028812 +0000 UTC m=+4092.783144703" observedRunningTime="2026-03-13 21:36:02.08752488 +0000 UTC m=+4093.108640771" watchObservedRunningTime="2026-03-13 21:36:02.092775044 +0000 UTC m=+4093.113890925" Mar 13 21:36:03 crc kubenswrapper[4790]: I0313 21:36:03.083831 4790 generic.go:334] "Generic (PLEG): container finished" podID="fbfacc9a-83d9-474b-bd66-32171b5e688a" containerID="49c9970e251292b4ff303c24b3a9ca28ce85ada6bb14912f2c5df8bfba473cb7" exitCode=0 Mar 13 21:36:03 crc kubenswrapper[4790]: I0313 21:36:03.083988 4790 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29557296-lfckn" event={"ID":"fbfacc9a-83d9-474b-bd66-32171b5e688a","Type":"ContainerDied","Data":"49c9970e251292b4ff303c24b3a9ca28ce85ada6bb14912f2c5df8bfba473cb7"}